JacquesC

Prof. Jacques Carette

2401 Reputation

17 Badges

20 years, 86 days
McMaster University
Professor or university staff
Hamilton, Ontario, Canada

From a Maple perspective: I first started using it in 1985 (it was Maple 4.0, but I still have a Maple 3.3 manual!). Worked as a Maple tutor in 1987. Joined the company in 1991 as the sole GUI developer and wrote the first Windows version of Maple (for Windows 3.0). Founded the Math group in 1992. Worked remotely from France (still in Math, hosted by the ALGO project) from fall 1993 to summer 1996, during which I did my PhD in complex dynamics at Orsay. Soon after I returned to Ontario, I became the Manager of the Math Group, which I grew from 2 people to 12 in 2.5 years. Got "promoted" into project management (for Maple 6, the last of the releases that allowed a lot of backward-incompatible changes, aka the last time that design mistakes from the past were allowed to be fixed), and then moved on to an ill-fated web project (it was 1999, after all). After that, I worked on coordinating the output from the (many!) research labs Maplesoft then worked with, on some Maple design and coding (inert form, the box model for Maplets, some aspects of MathML, context menus, a prototype compiler, and more), and on some of the initial work on MapleNet. In 2002, an opportunity came up for a faculty position, which I took. After many years of being confronted with Maple's weaknesses, I had a number of ideas about how I would go about 'doing better' -- but these ideas required a radical change of architecture, which I could not do within Maplesoft. I have been working on producing a 'better' system ever since.

MaplePrimes Activity


These are replies submitted by JacquesC

As I have said before, the Maple documentation tries really hard not to scare the users by using big words or fancy concepts that might be new.  It uses functional ideas throughout, but it doesn't admit it in so many words. 

The Mathematica documentation is the opposite.  Since Mathematica's marketing is that Mathematica is ground-breaking in pretty much everything it does, and that its users are better off using these brilliant new ideas rather than what they have been taught before, the documentation boldly forges ahead with wild claims.

Frankly, I think both these extremes are awful.  Amusingly, they both insult users' intelligence, but by completely different means.
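
To make that concrete, here is a small illustration of my own (not taken from either manual) of the functional style that Maple's documentation quietly leans on: map and select are higher-order functions, even if the help pages rarely use that phrase.

> L := [seq(i^2, i = 1 .. 10)]:            # build a list of squares with seq
> map(x -> x + 1, L);                      # apply an anonymous function to each element
> select(x -> irem(x, 2) = 0, L);          # keep only the even squares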

I was surprised to find myself agreeing with much of what was said above.  Nothing to do with acer, but rather that my instinct was to hold the opposite opinion to what was stated above.  But acer convinced me that I wasn't so much wrong as suffering from a lack of perspective.

Basically the point is that >I< want a review like Wester's.  That is because it fits my own usage pattern of a CAS!  Of course, there are few people outside CAS builders and 'generalists' who have this particular usage pattern.  There are, however, definite usage patterns amongst different communities that can be fruitfully used to generate good reviews that would be of use for each of those communities.  Note that I agree with jakubi that one of the best ways to get these reviews done, especially to help with the aspect of getting representative experts involved, is to use a wiki.

Having said that, somewhere inside a review like Wester's, a really valuable review is hiding.  Pretty much all uses of a CAS rely on a core set of features.  Users of a CAS may not realize that they are constantly using finite fields, polynomial rings or series expansions for a huge number of their (symbolic) computations, but they are.  So there is a point to creating a focused review of a CAS's foundational capabilities.  This lets you know how solid the base of the system is.  Generally speaking, the mathematics covered in first- and second-year university courses (not just math courses, but all the math used), plus some selected third-year courses, indirectly covers a very large part of those foundations.  Basically, if a system cannot get that right, what hope is there for the more advanced features being right?  If one were to redo a Wester-like review but focused on this aspect of things, I believe that would be quite valuable.  Perhaps not as a tool for deciding which product to purchase, but certainly as a tool to understand which product seems to be built on the most solid foundations.
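
To give a concrete flavour of what I mean by 'foundational' (these examples are mine, not part of any existing review), here are the kinds of operations that sit underneath a great deal of everyday symbolic work:

> series(sin(x)/x, x = 0, 8);              # series expansion
> Factor(x^8 + x^4 + 1) mod 5;             # polynomial factorization over GF(5)
> gcd(x^4 - 1, x^6 - 1);                   # arithmetic in a polynomial ring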

Usually the lambda calculus does not treat open terms, i.e., terms with free (unbound) variables in them.  Usually, only closed terms are given a semantics.

Unbound names occurring in functions are usually treated as global.  [The only exception is when one explicitly inserts an escaped local from another scope into a function.]  Since the usual lambda calculus does not possess mutable variables, never mind global 'variables', there is no need to figure out what they might mean.
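
In Maple terms, a minimal sketch of this (my own toy example): a name appearing in a procedure body that is neither a parameter nor a declared local refers to the global name.

> f := proc(x) x + a end proc:     # 'a' is not a parameter or a local, so it is the global a
> a := 10:
> f(5);                            # evaluates to 15, picking up the global binding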

Or at least try to predict the future via reading the press releases :-) [For those not familiar with the expression, see reading tea leaves on Wikipedia].

It seems that the target market is mostly engineers now.  They (engineers) seem to care much less about special functions.  Numerical integration is still an issue, but (numerical) DE solving is even more important, as far as I can tell.

Note: why do engineers not care about special functions?  Because they have been taught to work numerically, that's why.  If they tried to work symbolically on some of their models, they would start to care about special functions as much as the physicists!  It is always interesting to me how an education can sometimes act as a barrier to progress.
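
As a hedged illustration of that emphasis (a toy ODE of my own choosing, not anyone's actual model), the same problem can be attacked symbolically or numerically in Maple:

> ode := diff(y(t), t) = -2*y(t) + sin(t):
> dsolve({ode, y(0) = 1});                  # symbolic solution, in terms of exp, sin, cos
> sol := dsolve({ode, y(0) = 1}, numeric):  # numeric solution procedure
> sol(1.0);                                 # evaluate the numeric solution at t = 1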

This is interesting.  Maple knows exactly what the answer is, but has no mechanism for writing down the answer, so it throws it out!  If p denotes your piecewise, then

> solve({op(2,p)},{t}, AllSolutions);

                                          -849     4245
              {t = 5 LambertW(_NN1~, -exp(----)) + ----}
                                           49       49

> solve({op(4,p)},{t}, AllSolutions);

                  td
  {t = -30 exp(- ----) + 5/7 LambertW(_NN2~,
                  5

        /             td  \     49 td                   td
        |6 - 7 exp(- ----)| exp(----- - 843 + 42 exp(- ----))) - 6 td
        \             5   /       5                     5

         + 4215/7}

The issue is that Maple does not seem to want to represent piecewise-defined answers.  This is not entirely surprising, as I am not sure anyone has ever really worked out the theory for that.  That doesn't mean it is necessarily difficult; it just might not have been done yet.
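
One can of course reassemble a piecewise answer by hand.  Here is a hedged sketch on a toy piecewise of my own (not the original problem): solve each branch value separately, then rebuild the piecewise from the branch conditions.

> q := piecewise(t < td, 2*t - 4, t >= td, 3*t - td):
> s1 := solve(op(2, q), t):        # solve the first branch value for t
> s2 := solve(op(4, q), t):        # solve the second branch value for t
> piecewise(op(1, q), s1, op(3, q), s2);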

For one, you still have floats, not rationals in there.  Plus, the piecewise you gave is most certainly "backwards", since it is 0 everywhere *except* on the line t=td.  Is that really what you want?  I would have expected the opposite!  Additionally, if you want t in terms of td, you should have

solve(..., {t});

otherwise it will solve for both variables.

Note the following:

> p := convert(piecewise(t=td,-.007*t+.03*exp(1.4*td-1.4*t)-.035*exp(1.2*td-1.4*t)-.042*td+4.215-.21*exp(-.2*td)), rational):
> solve({p},{t}, AllSolutions) assuming t=td;

                  td
  {t = -30 exp(- ----) + 5/7 LambertW(_NN1~,
                  5

        /             td  \     49 td                   td
        |6 - 7 exp(- ----)| exp(----- - 843 + 42 exp(- ----))) - 6 td
        \             5   /       5                     5

         + 4215/7}
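
A tiny illustration of the float-versus-rational point (my own example, not from this thread):

> solve(0.3*x - 1, x);                          # float coefficient, gives a float answer
> solve(convert(0.3, rational)*x - 1, x);       # exact coefficient, gives the exact 10/3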

I am really surprised by how badly Maple is performing here.  Fast and accurate numerical evaluation of even very complex mathematical functions is one of the unsung strengths of Maple.  [Mathematica's basic arithmetic is way faster, but Maple's implementations for special functions are much more accurate and yet still fast].  Especially since I know that a huge amount of effort has been spent on the hypergeometric family.  One of the standard tricks is to use different algorithms over different regions (for both accuracy and speed), so that for this function I fully expected the asymptotic expansion to be used.  I guess that 1F1 is still on the 'to do' pile!
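
For what it is worth, here is the kind of probe I have in mind (with parameters I made up, purely for illustration): evaluate 1F1 at a largish argument at two different precisions and see whether the answers agree to the expected number of digits.

> Digits := 15:
> evalf(hypergeom([1/3], [7/2], 200));
> Digits := 40:
> evalf(hypergeom([1/3], [7/2], 200));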

The format for solve is best given as solve({eqns}, {vars});.  As written, you are not telling Maple which variable(s) to solve for.  Also, unless you have no choice, you are much better off using exact rationals rather than floating-point numbers when calling solve.
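
For instance (a made-up system, just to show the calling form):

> solve({x + y = 3, x - y = 1}, {x, y});     # returns {x = 2, y = 1}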
