Mac Dude

1571 Reputation

17 Badges

13 years, 89 days

MaplePrimes Activity


These are replies submitted by Mac Dude

I have one called MAPLE An Introduction and Reference by Michael Kofler (Addison Wesley). Like the book spradlig mentioned, it is much more than a rehash of the manual and shows the limits and quirks of the system. Quite useful, up to a point.

The problem with all of these is that they were written for Maple V. While the claim is made, at least by Rimrock, that Maple hasn't changed much under the hood, this is only partially true: vector and matrix have been superseded, modules and records are new, and there is a host of new packages.

I would rather strongly recommend reading Maplesoft's documentation, esp. the User Manual and the Programming Guide (free to d/l). At least then you start with the more modern constructs. The older books then help you deal with the quirks and idiosyncrasies of Maple.

Mac Dude.

@Alejandro Jakubi I agree with all you are saying, except you are not mentioning the alternative, which is to bail out to "that other software." I am neither advocating that, nor do I have any clue whether Mma is any better in this regard, as I am not hanging out on its websites and have not used it for many years. But I do perceive loss of users for any reason, valid or not, as a risk for Maple and, as they say, the grass usually looks greener on the other side. Maplesoft is doing the right thing by asking users for their opinion; they are not being helped by us not pointing out where they can improve.

It is Maplesoft's job to listen. I know from many posts here that people point out issues. I actually just checked on my favorite graphics bug (the bug in plots:-display with log plots plotting linear when gridlines are on): it is still not fixed. And for this one I did submit an SCR and even had a Maplesoft employee verify it as a bug. Why should I SCR anything if something that basic does not get attention two versions after it was reported (I SCR'd it against Maple 15)?

Yours very frustrated,

Mac Dude

 

@Jimmy The statistical indicator for goodness of fit is Chi^2/n, which is the sum of the squared residuals (i.e. the sum of the squared differences between the data points and the fitted function evaluated at those points) divided by the number of degrees of freedom. For practical purposes, the number of degrees of freedom is the number of data points minus the number of free parameters being fitted. The division by n prevents the mere addition of parameters from seemingly giving a better fit when the additional parameters are statistically insignificant (more parameters almost always reduce Chi^2, but not necessarily Chi^2/n).

The standard errors (or deviations) are associated with the fitted parameters; they are related to Chi^2/n, but not in a simple way, and they also depend strongly on the form of your function, the number of free parameters, etc.
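For concreteness, here is a minimal Maple sketch of the Chi^2/n computation described above (the procedure name and arguments are made up for illustration; this is not from any particular package):

```maple
# Chi^2 per degree of freedom for a fitted model.
# xdata, ydata: the measured points; f: the fitted function;
# p: the number of free parameters that were fitted.
ChiSqPerDof := proc(xdata::list, ydata::list, f, p::posint)
  local n;
  n := nops(xdata);
  # sum of squared residuals, divided by the degrees of freedom n - p
  add((ydata[i] - f(xdata[i]))^2, i = 1 .. n) / (n - p);
end proc:
```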

M.D.

I used Mma many years ago (V2.2) and then had a long hiatus where my work neither required nor left me time to use CAS languages. Then, a couple of years ago, I found a need and decided to take advantage of the site license our lab has. I should also mention that I use Maple in a scientific context, mostly to simulate systems, do theoretical calculations, and analyse data. Education is not my main field, although I am working on a course I will be giving next year; that course I am building with and around Maple.

Here are, without much prose, my reasons and thoughts:

Trivial reasons:
Availability as a site license at our lab, and licensing in general. Once installed & activated, I have it on a machine, no muss, no fuss, no license servers etc.
Relatively long support of PowerMacs. I have been and still am running some PPC Macs, and Maple 15 runs fine on these. This may become an issue again when Maple 18 comes out, since support for Snow Leopard is supposed to end then. I hope it does not; I won't be able to switch to 10.7 anytime soon (on Intel Macs, obviously).

Maple-capabilities related reasons:
I find modules and records to be an incredibly powerful feature set, allowing me to do "OOP lite": not full object-oriented programming, but complete enough to take real advantage of. Case in point: I recently needed to work out the magnetic field of rectangular Helmholtz coils. By building a Coils module with a "constructor" that creates a Coil record, a field-evaluation function, and a couple of functions to rotate and translate a Coil instance, I was able to build, in about a couple of days, a system that lets me model all sorts of weird coils, in fact leading to design optimizations I wouldn't have attempted otherwise. Unbeatable! I am also working on a package to do particle optics, where I use records to represent optical elements and overload some of the LinearAlgebra operations to string them together to build beam lines.
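To illustrate the pattern (this is not the actual Coils code; all field and procedure names here are invented), such a module might look roughly like this:

```maple
Coils := module()
  export NewCoil, Translate;
  # "constructor": returns a Record representing one rectangular coil
  NewCoil := proc(width, height, current)
    Record('w' = width, 'h' = height, 'amps' = current,
           'center' = Vector([0, 0, 0]));
  end proc;
  # shift a Coil instance by a displacement Vector d
  Translate := proc(c, d::Vector)
    c:-center := c:-center + d;
    c;
  end proc;
end module:
```

Since Records are mutable, Translate can update the instance in place; a field-evaluation routine would then just read c:-center and the geometry fields.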

The close connection of Maple to Matlab is a plus in our lab, where Matlab is in widespread use.

I personally find the Maple language to be easier to learn and memorize than others, although I know that a number of people will disagree on that. Constructs like seq() and friends allow for a terse, functional programming style. I do not know whether other CAS have similar features, though.
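A trivial example of that terse style, using only stock language builtins:

```maple
# build a list and a sum without writing an explicit loop
squares := [seq(k^2, k = 1 .. 10)];   # [1, 4, 9, ..., 100]
total := add(k^2, k = 1 .. 10);       # 385
```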

I find the Physics package intriguing and am working with it.

What could make me leave Maple for "that other system" (and this is a corollary question you need to ask):
Too many rough edges for my taste. Weirdnesses and outright bugs persist. E.g., plots:-display fails on logplots when gridlines are on (in 15, and I believe in 17 as well). E.g., loading Threads and using it gets the Maple kernel into a weird state with a hang upon restart (a regression; it was ok in 15 and is broken in 17). Others I forget, but they are there.

Many commands are not "orthogonal", meaning they work only on certain data constructs (like on lists but not Vectors etc.) forcing unnecessary convert statements. I know Maple 17 is better in that regard than 15 and I appreciate that.

My biggest wish:
Fix those bugs! Maple is way too nice and powerful a package to let it look unpolished. I would want many more updates between major versions, fixing things as the reports come in. There is no excuse not to do that. Avoid regressions at any cost.

(Remark here: I know this may be perceived as too general a statement. But the reality is that I am not paid to debug Maple but to produce results. So I am finding it difficult to make time to collect and catalog and SCR all the issues I am running across. In many cases one can find a work-around but I do find that need rather annoying. I am also rather keenly aware that some "bugs" are really the idiosyncrasies of a CAS and not really bugs. It is not a trivial situation. But I do maintain that, as I am getting deeper into it, Maple seems rougher than necessary or acceptable.)

Mac Dude

 

@acer You are right. Now that I reread your answer, I see your qualifications and advice loud & clear.

M.D.

 


@acer: While your suggestions about warnlevel are certainly correct in a technical sense, I do wonder whether we really want to advocate turning warnings off. They can often indicate coding errors and hint at where one is going off-trail. Even in the case of the OP, an unexpected warning can hint at a typo that would go undetected at warnlevel=1 or 0.

So, Gaia, please do not do that, for your own sake. It is easy enough in Maple to write code free from warnings.

Just my $0.04,

M.D.


@ecterrab 

What I really want to do is to transform nuclear reaction cross sections from various systems (centre-of-mass, rest system of a moving particle etc.) to the lab system. I can of course do this just programming the respective formulae, but I like to use extant facilities (like the Physics package) so I can extend my calculations if I want to. 

So I think the Physics SpaceTimeVector should be the framework I need and the metric should be Minkowski by default. I haven't actually worked with it yet so I won't know for sure before I tried. Ideally, relations like the norm^2 of a 4-momentum being the rest energy come out naturally in the package. I'll find out.

Thanks,

M. D.

@ecterrab 

Starting to dig a little (actually for a beam-diagnostics project I am considering) I was looking for 4-Vectors and 4-momenta in the Physics package, but cannot find them.

Obviously I can cobble something together myself, but if it exists I'd rather use Physics for this. 

If 4-Vectors don't exist in Physics yet; wouldn't these seem like a worthwhile addition??

Mac Dude

Axel,

While I agree in general with you that a CAS should not try to become a desktop-publishing system, some of us publish papers using results or plots from Maple. Even though the rules are relaxed quite a bit nowadays, publications still need to have proper plots with correct labels and annotations. Sometimes a plot done in Maple is only doable in another program with a lot of extra work. So plots with custom tickmarks, proper typesetting in labels, what-have-you, are an important feature worth having and worth maintaining properly. I'd even argue that Maple is not leading in this dept.

Just my $0.02.

Mac Dude

 


After my reply to Alejandro Jakubi I looked at this a little more and realized that, in fact, pulling in the data from the CODATA file was quite easy, since the Maple names to a large extent follow the same convention, so the translation and matching up is fairly trivial. So I cobbled together a quick-and-dirty Maple procedure that pulls in the constants from the file and, using ScientificConstants:-ModifyConstant, updates all those it finds a Maple equivalent for.

So far so good; that part basically works. It becomes however quickly apparent that not all is well: the conversions between the units do not get updated and are now inconsistent with the values of the constants.
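For reference, the working part boils down to calls like the following (a stripped-down sketch; the file parsing is omitted, the electron-mass symbol is the one ScientificConstants uses, and the option names are as I understand them from the help page):

```maple
with(ScientificConstants):
# after parsing the CODATA line for, e.g., the electron mass:
ModifyConstant('m[e]',
               'value' = 9.10938291e-31,        # 2010 CODATA value, in kg
               'uncertainty' = 0.00000040e-31);
```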

Example:

electron_mass is 9.10938291E-31 kg (correct 2010 value)

electron_mass*c^2 is 8.18710506545917E-14 J (Maple-calculated but in agreement with 2010 value except for some spurious digits)

convert(electron_mass*c^2,'units','MeV') is 0.510 998 683 613 828 MeV, which is wrong; the correct value is 0.510 998 928 (also in the data file).

So it appears the unit conversion does not get updated and does not use the ScientificConstants numbers (not per se surprising, as these are separate packages). I poked around extensively but am not able to find where I can update the unit-conversion constants. The data file has values for a number of these conversions, so I could pull in and modify those constants as well.

Does anyone have a hint how to go about that?

M. D.

@Alejandro Jakubi Yeah, I was thinking of that botched Mars mission... but I don't work in that field, so it seemed inappropriate as an example from me (and yes, I do work with particle accelerators).

No one in his/her right mind will use a bug in Maple (or Matlab, or whatever) as an excuse to screw up a job. What I am concerned about here is that our favorite CAS shows some disturbing signs of a lack of quality control. We have all come to accept (wrongly?) that software has some quirks. I also accept that some of the things I do with Maple are pushing the envelope and may fail (I am thinking here about things like complicated integrals that may not evaluate even numerically when they should, or may evaluate wrongly). Wrong results are never nice, but I can accept them happening in borderline cases or when an algorithm gets abused.

Simple bugs may slip through, although they should not. The best way of dealing with them is to release updates as they get discovered. The flood of security updates for many software products demonstrates that this is possible. Like others here I commend Maplesoft for planning to make updates of the Physics package available as they are produced. I will argue that the ScientificConstants package warrants a similar treatment.

An ASCII table with CODATA values (not necessarily the same as PDG values) is available at http://physics.nist.gov/cuu/Constants/Table/allascii.txt. This is for the 2010 values (their latest adjustment). It is not in a perfect format for pulling into Maple and updating the ScientificConstants, but it seems close enough that a Maple script of limited complexity could make it work. In fact, I am tempted to look into this, although I am already juggling more than I can handle, so I could not possibly commit to anything.

Note that this is for physics constants and not elements or isotopes.

M. D.

Although this horse has been beaten quite to death, let me chime in given that I work in physics and need to use physics constants fairly regularly.

For any kind of professional use, at least in particle physics and related fields, there have to be regular updates that make the ScientificConstants values consistent with the PDG (Particle Data Group, pdg.lbl.gov) values. The PDG puts out regular publications (of a significant volume) that give the latest accepted values, usually based on the then-latest CODATA recommended values. Sometimes (if there are significant new results not yet in CODATA) they deviate. This is what the particle physics community uses, as does the majority of the particle accelerator community. These updates appear every second year or so. The actual least-squares adjustment happens much less frequently; that is why PDG differs in some cases.

If regular updates are not practical, a way to update the values is necessary. ScientificConstants actually has that (ModifyConstant). It is not a perfect way to deal with the situation, but it is a possible way. A better way would be a means to update the underlying library procedure.

I have a hard time finding an excuse for why these constants do not get updated with every major, paid-for Maple release. Heck, it can't be that difficult! PDG may well make files available if asked; after all, these people are publicly funded.

I also (to reiterate a rant I let off before) have a hard time accepting why bugs in ScientificConstants persist over generations of Maple. In my case it is the failure to properly treat the derived constants when changing the unit system, a bug present in Maple 15 and 17. I do note that ModifyConstant got me around that one as well, but I'd rather not rely on Maple's constants when designing the next billion-$ particle collider :-).

None of this is fatal. But it is not good PR either. 

Mac Dude.
