JacquesC

Prof. Jacques Carette

2396 Reputation

17 Badges

19 years, 120 days
McMaster University
Professor or university staff
Hamilton, Ontario, Canada


From a Maple perspective: I first started using it in 1985 (it was Maple 4.0, but I still have a Maple 3.3 manual!). Worked as a Maple tutor in 1987. Joined the company in 1991 as the sole GUI developer and wrote the first Windows version of Maple (for Windows 3.0). Founded the Math group in 1992. Worked remotely from France (still in Math, hosted by the ALGO project) from fall 1993 to summer 1996, during which I did my PhD in complex dynamics at Orsay. Soon after I returned to Ontario, I became the Manager of the Math Group, which I grew from 2 people to 12 in 2.5 years. Got "promoted" into project management (for Maple 6, the last of the releases which allowed a lot of backward incompatibilities, aka the last time that design mistakes from the past were allowed to be fixed), and then moved on to an ill-fated web project (it was 1999, after all). After that, I worked on coordinating the output from the (many!) research labs Maplesoft then worked with, as well as some Maple design and coding (the inert form, the box model for Maplets, some aspects of MathML, context menus, a prototype compiler, and more), plus some of the initial work on MapleNet. In 2002, an opportunity came up for a faculty position, which I took. After many years of being confronted with Maple's weaknesses, I had accumulated a number of ideas about how I would go about 'doing better' -- but these ideas required a radical change of architecture, which I could not do within Maplesoft. I have been working on producing a 'better' system ever since.

MaplePrimes Activity


These are replies submitted by JacquesC

I am flattered to be nominated, but you guys should look carefully at what the winner will get.  The winner is supposed to go talk to Maplesoft developers (for a week), and the subtext is that they will listen to the concerns and advice of an esteemed member of the MaplePrimes community.

But that isn't really true.  There are some members of the MaplePrimes community (yours truly included) who have too much history with Maplesoft for their advice to be adequately and objectively evaluated.  The nominee needs to be someone whose advice stands a chance of making an impact.  I will let the other nominees comment for themselves on whether they think they could serve in this role.

This part of the low-level design of 'assume' was known to be a hack, but it was the best mostly-working hack available at the time (Maple's assume facility is now rather old).  Maple has evolved since, and newer features (like attributes) would have been a much better low-level implementation strategy for assume.  But that would have introduced all sorts of subtle backwards-compatibility problems, even though it generally would have been an improvement.

This is the cost one must bear for total backwards compatibility: old design bugs can rarely be fixed.  This causes a lot of inertia for old software, and over time it can lead to serious sclerosis.  In other areas, this has allowed newer competitors a foothold even though the start-up cost is tremendous [think operating systems as the most obvious example].

 

If the Watcom compiler, which comes with Maple 12, is known not to work properly in directories with spaces in the name:

1. Why does Maple install in a directory with a space, thereby neutering one of its own features?

2. Why wasn't this caught before shipping?

3. Why is the default directory for the compiler one which will cause failure?

There must be something more to it than that!  [The people at Maplesoft are more competent than to let such an obvious bug ship; we have to believe that.]

I just tried this (in Maple 11 Classic), and it works fine, even with fss as a list.  It should work.

I would say that this is a bug in whatever version of Maple you are using.  Are you maybe using Standard in 2D math mode?

As you've pointed out, it all comes down to 'usefulness', and more precisely, how useful is an evalat function which is sometimes silently wrong?  Of course, if the problem were that simple, it would have been fixed long ago.  The other question is, how useful is an evalat function which is really, really slow?

Personally, I am greatly inspired by one of the principal design decisions behind the Chez Scheme compiler, namely that "Lesser used operations must pay their own way" (see Kent Dybvig's wonderful paper on the historical development of Chez Scheme, or if you're in a hurry, the slides).  Another way to say this is that lesser used operations must not have any cost when they are not used.

In Maple, this is frequently not the case - new features make older (and much more common) features slower.  So adding new (core) features is a non-trivial decision.  If many routines, most notably evalat, were structured such that the cost of execution was directly proportional to the features used in the actual call [rather than being proportional to all the features that the function supports], then this whole problem would go away!  evalat would be as expensive (or cheap) as necessary to get the job done. 

In my mind, the system should work only as hard at zero-testing as the expression at hand requires.  What the system should most definitely not do is always work as hard at zero-testing as the most complicated expression expressible in Maple would require.  And for too many Maple routines, one has to pay the cost of the full power, instead of the sliding cost of "just enough" power.
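The "pay as you go" idea above can be sketched as a tiered dispatch: cheap decision procedures run first, and the expensive general machinery is only reached when the cheaper tiers cannot decide.  This is an illustrative Python sketch of the principle, not Maple's actual testzero code; the input representations (numbers, sparse polynomials as dicts) are assumptions for the example.

```python
def is_zero(expr):
    """Sliding-cost zero test: each tier runs only if the cheaper
    tiers could not decide (an illustration, not Maple's testzero)."""
    # tier 0: literal numeric constants -- O(1)
    if isinstance(expr, (int, float)):
        return expr == 0
    # tier 1: sparse polynomial given as {exponent: coefficient} -- O(#terms)
    if isinstance(expr, dict):
        return all(c == 0 for c in expr.values())
    # tier 2: the expensive general prover, reached only when needed --
    # callers with simple inputs never pay for it
    raise NotImplementedError("general zero-testing not required for this input")
```

With this structure, the cost of a call is proportional to the features the input actually uses, which is exactly the Chez Scheme design principle quoted above.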

If you have 30 linear constraints in 27 functions, either 3 of them are redundant or your system is inconsistent.  Note that redundant constraints would also cause difficulties for most numerical methods.

I am very puzzled as to why the TTY version has a larger initial set of protected names!  I don't have an explanation for that either.

As far as I can tell, this method cannot give all the protected names if there is a protected name which isn't reachable from the root set given by the initial names.  It is possible that that set is empty, though it would take a fair bit of work to determine that.
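The reachability argument can be made concrete as a breadth-first closure over a name-reference graph.  This is a Python sketch under an assumed representation: `refs` maps each name to the names it mentions, standing in for scanning each procedure's body; any name not on a path from the roots is exactly the kind the method would miss.

```python
from collections import deque

def reachable_names(roots, refs):
    """Breadth-first closure of the root set under the reference map."""
    seen = set(roots)
    work = deque(roots)
    while work:
        name = work.popleft()
        for mentioned in refs.get(name, ()):
            if mentioned not in seen:
                seen.add(mentioned)
                work.append(mentioned)
    return seen
```

A protected name that no reachable name ever mentions (an "orphan") will not appear in the result, which is the gap described above.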

No, the index is not the order.  For a DAE, it roughly corresponds to the (minimum) number of derivatives you have to take of your system before you are able to transform it into a (larger) system of ODEs.  It measures how 'implicit' your system is, and how difficult the problem of finding good initial starting points is.
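A tiny worked example of that counting, using a hypothetical DAE chosen for illustration: take x' = y with the constraint 0 = x - t^2.  The constraint must be differentiated twice before y gets an explicit ODE, so this system has index 2.  A sympy sketch of the two differentiations:

```python
import sympy as sp

t = sp.symbols('t')
x = sp.Function('x')(t)
y = sp.Function('y')(t)

# Hypothetical DAE: x' = y together with the constraint 0 = x - t^2
g = x - t**2

# First differentiation of the constraint, using x' = y:
g1 = g.diff(t).subs(x.diff(t), y)   # y - 2*t : y now appears, but only algebraically
# Second differentiation gives an explicit ODE for y:
g2 = g1.diff(t)                     # y' - 2
```

Two differentiations were needed before every unknown had an explicit ODE, matching the "minimum number of derivatives" characterization of the index.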

What platform are you on?  Maple might be trying to write into a directory to which you indeed don't have write access, but the directories used depend on the platform (and your user privileges).

Adding new names to the kernel was always regarded as really bad, so it became customary to overload concepts a lot, especially if there was a plausible reasoning that could be applied.  So Complex creates both general complex numbers and 'special' complex numbers, namely the ones which are purely imaginary, through different calling conventions.

When the number of routines written in C grew above several hundred (once you factor in all the external libraries), this kind of overloading started to seem silly.  But this particular routine was done at the same time as the kernel grew significantly, so it was really done in the old style.
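The arity-based overloading described above can be sketched in a few lines.  This is a Python illustration of the calling-convention idea (in Maple, Complex(b) builds b*I while Complex(a, b) builds a + b*I); the name `make_complex` is hypothetical.

```python
def make_complex(*args):
    """Arity-overloaded constructor: one argument -> purely imaginary,
    two arguments -> re + im*I (a sketch of the convention, not kernel code)."""
    if len(args) == 1:
        return complex(0, args[0])        # 'special' purely imaginary number
    if len(args) == 2:
        return complex(args[0], args[1])  # general complex number
    raise TypeError("expected 1 or 2 arguments")
```

Packing two concepts into one name keeps the kernel's name count down, at the cost of the reader having to know both calling conventions.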

Stupid <maple> tag.  The missing equation says a^h = exp(h*ln(a)).
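That identity is easy to spot-check numerically; the check below assumes the real branch, i.e. a > 0, and uses arbitrary sample values.

```python
import math

# Spot-check of a^h = exp(h*ln(a)) for real a > 0, at sample values
a, h = 2.5, 1.7
lhs = a ** h
rhs = math.exp(h * math.log(a))
assert math.isclose(lhs, rhs)
```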


Using define_external, you can call Fortran routines from Maple.

at least for large polynomials.  PolynomialTools[CoefficientVector] is "better" in that it can handle sparse polynomials, which CoefficientList will necessarily make dense.  PolynomialTools[CoefficientVector] is O(#terms), with subsequent extraction O(1), while coeff is O(#terms) [so that extracting all coefficients is O(#terms^2)].

I am rather happy with that code -- that was one of the last routines which I added to Maple.  Although some of the credit should go to Bruno Salvy, who first pointed out the frequent Maple anti-pattern of using coeff in a loop, and a better solution, which became the heart of PolynomialTools[CoefficientVector].
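The anti-pattern and its fix are easy to see outside Maple too.  A Python sketch under an assumed sparse representation (a polynomial as a list of (exponent, coefficient) pairs); the function names are hypothetical stand-ins for coeff-in-a-loop and CoefficientVector:

```python
def coeffs_by_scanning(poly, deg):
    """coeff-in-a-loop analogue: one full scan of the term list per
    coefficient, O(#terms) each, hence O(#terms^2) to get them all."""
    return [sum(c for e, c in poly if e == k) for k in range(deg + 1)]

def coefficient_vector(poly, deg):
    """CoefficientVector-style single pass: O(#terms) to build the
    vector, then each extraction is an O(1) index lookup."""
    v = [0] * (deg + 1)
    for e, c in poly:
        v[e] += c
    return v
```

Both produce the same dense vector; the single-pass version simply refuses to re-scan the term list once per coefficient.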

