JacquesC

Prof. Jacques Carette

2401 Reputation

17 Badges

20 years, 88 days
McMaster University
Professor or university staff
Hamilton, Ontario, Canada


From a Maple perspective: I first started using it in 1985 (it was Maple 4.0, but I still have a Maple 3.3 manual!) and worked as a Maple tutor in 1987. I joined the company in 1991 as the sole GUI developer and wrote the first Windows version of Maple (for Windows 3.0), then founded the Math group in 1992. From fall 1993 to summer 1996 I worked remotely from France (still in Math, hosted by the ALGO project) while doing my PhD in complex dynamics in Orsay. Soon after returning to Ontario, I became the Manager of the Math Group, which I grew from 2 people to 12 in 2.5 years.

I got "promoted" into project management (for Maple 6, the last release that allowed a lot of backward incompatibilities, i.e. the last time that design mistakes from the past were allowed to be fixed), and then moved on to an ill-fated web project (it was 1999, after all). After that, I coordinated the output from the (many!) research labs Maplesoft then worked with, did some Maple design and coding (the inert form, the box model for Maplets, some aspects of MathML, context menus, a prototype compiler, and more), and did some of the initial work on MapleNet.

In 2002, an opportunity came up for a faculty position, which I took. After many years of being confronted with Maple's weaknesses, I had accumulated a number of ideas about how I would go about 'doing better' -- but these ideas required a radical change of architecture, which I could not make within Maplesoft. I have been working on producing a 'better' system ever since.

MaplePrimes Activity


These are replies submitted by JacquesC

It has been a very long time since I have jotted down notes on movements in the rankings. Many (too many to list) people have hit a lot of milestones. The number of people with red maple leaves is quite large now. But what prompted me to write this note is that, with Dave L's post today, the silver leaves have now spilled onto the 2nd page! That is a real, quantitative sign that this site continues to be useful to the Maple community.
Using showstat(trigsubs), you'll see that it uses an internal table, where FunctionAdvisor has different ways of getting at information.
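For the curious, a minimal way to poke at this (showstat and trigsubs are the actual Maple names; the line numbers showstat prints will vary by release):

```maple
# Print the (numbered) source of the library routine, which
# reveals the internal table of identities it consults:
showstat(trigsubs);

# Called on an expression, trigsubs returns the equivalent
# forms it knows for it:
trigsubs(sin(2*x));
```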
is that the author of FunctionAdvisor did not even know of the existence of trigsubs. I myself had completely forgotten its existence, if I ever knew of it! From what I can tell, this routine was first written in 1984 (yes, that's right, 23 years ago), and is not really 'tied in' to the rest of Maple.
'type' is for types, and 'assume' is for properties. At one point Maple had only types, so the question "what is a type?" was never really asked. But in retrospect, the vast majority of types in Maple are "syntactic": in other words, they test something about the representation of an object. Of course, since types can be arbitrary computations in Maple, some are also semantic (like checking that an integer is prime). But the core types are syntactic, especially the so-called "structured types", which are really the workhorses of Maple's dynamic type system.

Properties, on the other hand, are most definitely semantic. This is why one can assume that certain symbols have certain properties. 'is' then checks those properties, so it is in effect a mini automated theorem prover embedded in Maple (but as a pure black box).

One design decision was that all types were to be properties too. This is both really useful and (in hindsight) a serious mistake. Having both syntax and semantics together is very dangerous -- doing this badly is the source of many of the paradoxes in logic, and incidentally includes Frege's original mistake in his logic (i.e. the Russell paradox). It is not impossible to get a consistent logic that deals with both syntax and semantics, but it is remarkably hard to do so -- see Chiron: a multi-paradigm logic for one example.

As Robert Israel pointed out, a lot of types are not in fact known to assume, so 'is' cannot reason about them at all. So one needs to be careful to use only those names which are actually known to assume -- a set which is not documented! Just to make things more interesting, the type-formation operators (like and, or, not) are different from the property operators (And, Or, Non). Furthermore, the type logic is 2-valued [FAIL is not allowed in types] while the property logic is 3-valued!
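A small illustration of the split (a sketch; exact behaviour may vary slightly across Maple releases):

```maple
# Types are mostly syntactic tests on the representation:
type(x + y, `+`);        # true: the expression is literally a sum
type(2, prime);          # true: a few types are semantic as well

# Properties live in the assume facility, and 'is' queries them:
assume(n, integer);
additionally(n > 0);
is(n, positive);         # true, deduced from the assumptions

# The property logic is 3-valued: with no assumptions on m,
# 'is' answers FAIL rather than false.
is(m, positive);         # FAIL
```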
is about 1 month behind answering some of my personal emails to him, so it is not so surprising that he hasn't looked at those threads himself, although he has posted to primes before. It is more surprising that no one else seems to know enough about this to reply! [Unfortunately, I don't].
Replacing seq by an explicit loop is "too obvious"; adding one element at a time to a list is notoriously inefficient (quadratic, since the whole list is copied on each append), which is why I used that intermediate table in my own solution. I meant to have a version where the looping is 'implicit' rather than explicit. It's ok to use map.
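To make the trade-offs concrete, here is a sketch of the three styles (a toy computation; squaring stands in for whatever the real per-element work is):

```maple
# Quadratic: each iteration copies the whole list to append one element.
L := [];
for i to 1000 do L := [op(L), i^2] end do;

# Linear: accumulate in a table, convert to a list once at the end.
T := table();
for i to 1000 do T[i] := i^2 end do;
L := [seq(T[i], i = 1 .. 1000)];

# Implicit looping: map (or seq) hides the loop altogether.
L := map(i -> i^2, [seq(i, i = 1 .. 1000)]);
```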
We intend to release (a version of) MapleMIX soon. Right now, it works for about 1/3 of the problems we throw at it, requires a minor bug fix for another 1/3, and a major fix for the remaining 1/3. That is good enough to call it a functional prototype, nearing alpha stage. But from most users' point of view, software that fails 2/3 of the time is clearly useless! If that does not daunt you, drop me an email and I can give you a functional version of MapleMIX to play with.
You are right, I was doing one large call rather than a lot of small calls. Yes, at small sizes, the overhead of LinearAlgebra dispatching can be murderous. And we've seen the same thing with Statistics dispatching too. I am working on that [actually, a better read is probably the recently submitted journal version]. I am hoping to do a blog post on that in the next couple of weeks. We are still looking into a couple of problems relating to specializing some LinearAlgebra code. And I hope that Dave L has seen your well-justified request for expanded access to f06zaf!
For N=10, if one splits the HermitianTranspose and the matrix product, then 9% is in HermitianTranspose and 91% in the product. There also seemed to be a bug in the above, where 'temp' was used where it seemed like 'rtemp' was meant. So if one changes line 6 to instead be

rho := LinearAlgebra[HermitianTranspose](rtemp);
rho := LinearAlgebra[LA_Main][MatrixMatrixMultiply](rho, rtemp, inplace=false, outputoptions=[]);

one gets a further speedup. Profiling that down, one sees that now the whole time is spent in the external call, so the speed appears 'optimal' (inasmuch as the external routine is the right one and coded properly).
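The profiling itself can be done with Maple's profile/showprofile pair (a sketch; the matrix size and the routine instrumented here are just placeholders for the real computation):

```maple
with(LinearAlgebra):

# Instrument the routine of interest, run the computation,
# then see where the time actually went.
profile(HermitianTranspose):
A := RandomMatrix(100, 100):
rho := HermitianTranspose(A) . A:
showprofile();
unprofile(HermitianTranspose):
```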