Maple Questions and Posts

These are Posts and Questions associated with the product, Maple

Please, I need help computing a function over a set of column data imported into Maple and then outputting the results as a new column in the table.

I have successfully imported the table data from Excel, but I need to write code that recognizes the column or columns I want to use as variables in the function and computes results from that column's data. The data table holds up to 14 x 25 entries.

Any advice would be much appreciated.
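A minimal sketch of one possible approach, assuming the spreadsheet is imported as a Matrix (the file name, column index, and function below are placeholders):

with(ExcelTools):
M := Import("data.xlsx"):                 # hypothetical file name
col := M[.., 3]:                          # pick the column to use as the variable
newcol := map(x -> x^2 + 1, col):         # apply a sample function to each entry
M2 := <M | newcol>:                       # append the results as a new column
Export(M2, "data_with_results.xlsx");     # write the augmented table back out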

This is a terrible Maple crash, just from a simple option!

Only when I try to print "x^2+y^2" in the caption of a plot does it raise an exception; for some other formulae there is no problem. Any technical guidance or advice would be appreciated.

Could someone check Maple 2018 for this issue too?

shekofte003.mw
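For reference, the kind of call being described might look like the following (a minimal sketch; the plotted function and caption text are placeholders, using the standard typeset mechanism for formulas in captions):

plot(x^2, x = -1 .. 1, caption = typeset("the formula ", x^2 + y^2));   # formula rendered in the caption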

Hello everybody,

I am quite new to Maple and just have to program a small tool.

I need to show a conclusion in a pop-up window, which should contain a matrix and a plot.

I tried different ways but they didn't work.
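One possible way to do this, sketched with the Maplets package (the window title, matrix, and plot below are placeholders; this is a sketch under those assumptions, not a verified solution):

with(Maplets[Elements]):
M := Matrix([[1, 2], [3, 4]]):                          # matrix to display
p := plot(sin(x), x = 0 .. 2*Pi):                       # plot to display
m := Maplet(Window('title' = "Results", [
       [MathMLViewer('value' = MathML:-Export(M))],     # renders the matrix
       [Plotter('value' = p)],                          # renders the plot
       [Button("OK", Shutdown())]
     ])):
Maplets[Display](m);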

Thanks in advance

Hello,

It seems that Maple 2017 handles the noncommutative product wrongly (while it used to do it properly up to Maple 17, as far as I know).

 

CODE:

with(Physics);
Setup(noncommutativeprefix={P,Q});
Q^2*P*Q + Q*P*Q^2;
simplify(%);

 

gives a correct result, but

Q(t)^2*P(t)*Q(t) + Q(t)*P(t)*Q(t)^2;
simplify(%)

 

gives the result:

2*Q(t)^3*P(t)   (which is wrong).

It used to work fine in Maple 17. I need to differentiate noncommutative polynomials in P(t), Q(t), which was done without problem in Maple 17 but now seems to be broken. 

 

Any explanation/workaround/fix? Is it fixed in Maple 2018?

One of the frustrating things I am finding in using the Maple debugger is that I can't set a breakpoint with stopat() at a line number inside an inner module if that inner module is local. It has to be an exported module.

Here is an example

restart;
foo:=module()
   export main_entry;
   local foo1; #inner module; if local, can't set breakpoint from outside

   foo1:=module()
     export main_entry;
     local foo2;

     foo2:=proc() #or breakpoint in first line here
        local s,x;
        s:=1;     #suppose I want to set break point here
        s:=s+sin(x);
     end proc;
     
     main_entry:=proc()         
        foo2();
     end proc;    
  end module;

  main_entry :=proc()
    foo1:-main_entry(); #call inner module's proc
  end proc;

end module;

Since the inner module is local to the outer module, I can't set a breakpoint anywhere inside the inner module's procs.

stopat(foo:-foo1::foo2);
Error, module does not export `foo1`

A workaround for now is to change "local" to "export" on the declarations of the inner modules I want to debug, so I can set breakpoints inside them, and then make them local again when done. So I changed "local foo1" to "export foo1" above, and now

          stopat(foo:-foo1::foo2,2);

works.

This was also a little strange to me, since foo2() is a LOCAL proc of module foo1, yet Maple did not complain like it did when module foo1 was a LOCAL module of foo. I would have expected Maple to complain again for the same reason.

So I am using the above workaround for now. But it is a little annoying because I have to keep changing modules from LOCAL to EXPORT in order to set breakpoints, and then remember to make them local again.

Is there a better workaround?
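One alternative that may be worth trying (an assumption on my part, not something verified here) is to turn off module opacity with kernelopts, which lets local module members be referenced from outside and so may let stopat resolve the local submodule without editing the source:

kernelopts(opaquemodules = false):   # allow access to local module members
stopat(foo:-foo1::foo2, 2);          # same call as before, without exporting foo1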

A product suggestion: make stopat() ignore the local module setting and treat it as export for the purpose of setting a breakpoint. This way one does not have to change the code just to set a breakpoint. Or, if the current behavior should not change, add a new option, as in

          stopat(........, option= ignore_local_setting)

So the default remains the same as now, but with the new option it will set a breakpoint anywhere, even in local modules.

 

Thank you

 

 

How can I calculate the roots of the following  trigonometric function?

 

F[lambda[i]] := (sin(lambda[i])-tan(y*lambda[i])*cos(lambda[i]))*((1-1/B)*lambda[i]^2-k)+(lambda[i]^3/B+lambda[i])*(cos(lambda[i])-tan(y*lambda[i])*sin(lambda[i]))

 

where i = 1..n, B = 1..30, y = 0.9, and k = 0.003.
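A minimal sketch of one way to locate the roots numerically (the indexed lambda[i] is written as a plain variable here; the value of B and the search interval are assumptions):

with(Student[Calculus1]):
y := 0.9:  k := 0.003:  B := 10:      # B picked arbitrarily from the stated range 1..30
F := (sin(lambda) - tan(y*lambda)*cos(lambda))*((1 - 1/B)*lambda^2 - k)
     + (lambda^3/B + lambda)*(cos(lambda) - tan(y*lambda)*sin(lambda)):
Roots(F, lambda = 0 .. 10, numeric);  # numeric roots on an assumed interval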

 

 

Hello, everyone! My name’s Sophie and I’m an intern at Maplesoft. @Samir Khan asked me to develop a couple of demonstration applications using the DeepLearning package - my work is featured on the Application Center.

I thought I’d describe two critical commands used in the applications – DNNClassifier() and DNNRegressor().

The DNNClassifier calls tf.estimator.DNNClassifier from the TensorFlow Python API. This command builds a feedforward multilayer neural network that is trained with a set of labeled data in order to perform classification on similar, unlabeled data.

The dataset used for training and validating the classifier has the type DataFrame in Maple. In the malignant/benign breast mass prediction example, the training set is a DataFrame with 32 columns in total, with column labels "ID Number", "Diagnosis", "radius", "texture", etc. Note that labeling the columns of the dataset is mandatory, since the neural network later needs to identify which feature column corresponds to which list of values.

Feature columns are what sit between the raw input data and the classifier model; TensorFlow requires them to specify how the input data should be transformed before being given to the model. Maple now supports three types of feature columns:

  • NumericColumn, which represents real, numerical values,
  • CategoricalColumn, which denotes categorical (ordinal) data, and
  • BucketizedColumn, which organizes continuous data into a discrete number of buckets with specified boundaries.

In this application, the input data consists of 30 real, numeric values that represent physical traits of a cell nucleus computed from a digitized image of the breast mass. We create a list of NumericColumns by calling

with(DeepLearning):
fc := [seq(NumericColumn(u,shape=[1]), u in cols[3..])]:

where cols is a list of column labels and shape=[1] indicates that each data input is just a single numeric value.

When we create a DNNClassifier, we need to specify the feature columns (input layer), the architecture of the neural network (hidden layers), and the number of classes (output layer). Recall that the DNNClassifier builds a feedforward multilayer neural network, hence when we call the function we need to indicate how many hidden layers we want and how many nodes there should be on each layer. This is done by passing a list of positive integers as the parameter hidden_units when we call the function. In the example, we did:

classifier := DNNClassifier(fc, hidden_units=[20,40,20],num_classes=2):

where we set 3 hidden layers with 20, 40, and 20 nodes respectively. In addition, there are 30 input nodes (i.e., the number of feature columns) and 1 output node (i.e., binary classification). The diagram below illustrates a simpler example with an input layer of 3 nodes, 2 hidden layers with 7 and 5 nodes, and an output layer with 1 node.

(Created using NN-SVG by https://github.com/zfrenchee/NN-SVG)

After we have built the model, we can train it by calling

classifier:-Train(train_data[3..32], train_data[2], steps = 256, num_epochs = 3, shuffle = true):

where we

  1. give the training data (train_data[3..32]) and the corresponding labels (train_data[2]) to the model,
  2. specify that the entire dataset will be passed to the model three times, with each pass running for 256 steps, and
  3. specify that the data batches for training will be created by randomly shuffling the tensors.

Now that the training process is complete, we can use the validation set to evaluate the effectiveness of our model.

classifier:-Evaluate(test_data[3..32],test_data[2], steps = 32);

The output indicates an accuracy of ~92.11% in this case. There are more metrics, like accuracy_baseline, auc, and average_loss, that help us decide whether we need to modify the architecture for better performance.

We then build a predictor function that takes an arbitrary set of measurements as a DataSeries and returns a prediction generated by the trained DNN classifier.

predictor := proc (ds) classifier:-Predict(Transpose(DataFrame(ds)), num_epochs = 1, shuffle = false)[1] end proc;

Now we can pass a DataSeries with 30 labeled rows to the predictor (recall that cols is a list of the column names):

ds := DataSeries([11.49, 14.59, 73.99, 404.9, 0.1046, 8.23E-02, 5.31E-02, 1.97E-02, 0.1779, 6.57E-02, 0.2034, 1.166, 1.567, 14.34, 4.96E-03, 2.11E-02, 4.16E-02, 8.04E-03, 1.84E-02, 3.61E-03, 12.4, 21.9, 82.04, 467.6, 0.1352, 0.201, 0.2596, 7.43E-02, 0.2941, 9.18E-02], labels = cols[3..]); 
predictor(ds);

The output indicates that the probability of this data belonging to class_id [0] is ~90.79%. In other words, according to our model, the probability of this breast mass being benign is ~90.79%.

The use of the DNNRegressor is very similar (almost identical) to that of the Classifier; the only significant difference is that while the Classifier predicts discrete labels as classes, the Regressor predicts a continuous quantitative result from the provided data (note that CategoricalColumn is still applicable). For more details about the basic usage of the DNNRegressor, please refer to Predicting the burnt area of forest fires with DNN Regressor.

 

An error message is seen when opening a Maple file:

ibb.co/hqOOkJ

I put the file link below. The Maple file was saved with Maple 2016.

https://files.fm/u/jpc3k5s4

I would be grateful if you could recover the file. Thanks.

I would like to plot a graph whose legend has a Greek character with a numeric subscript. Below is a small example showing what I am trying to do:

restart:

# Simplified example, which is not working:

omega0:= 10:
plot(omega^2/omega0, omega = 1..1.5,
legend = [sprintf("%s = %.1f",`ω`[0],omega0)]);

# The character I want to be plotted inside the legend:

`ω`[0];

I suppose the problem is that `ω`[0] is not considered a string and, therefore, I cannot pass it to the legend with "%s". I do not know how to make it work, though. Does anyone know how to do so?
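For what it's worth, one possible approach (a sketch using the standard typeset mechanism instead of sprintf; not verified against this exact example) is:

omega0 := 10:
plot(omega^2/omega0, omega = 1 .. 1.5,
     legend = [typeset(omega[0], " = ", omega0)]);   # omega[0] is rendered as a subscripted ω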

From help it says

"For any variable used within a procedure without being explicitly
mentioned in a local localSequence; or global globalSequence; the
following rules are used to determine whether it is local or global:

The variable is searched for amongst the locals and globals (explicit or implicit)
in surrounding procedures
, starting with the innermost.  If the name is
encountered as a parameter, local variable, or global variable of such
a surrounding procedure, that is what it refers to."

--------------------------------------------

So it seems that if I do not use an explicit "global" on a variable, Maple can figure out whether it is global or not using the above rules. But when I use an explicit "global" on a variable, Maple does not seem to do the same thing. Here is an example:
 

restart;
foo:=proc()
  local a, inner_proc;

  inner_proc := proc()
     local b;
     b   := ithprime(10);
     a   := b; #this assigned the global (to this proc)
               #variable, which is "a", correctly
  end proc;

  inner_proc();

  return(a);
end proc;

Calling foo() gives 29.

But since the variable "a" in foo() is global with respect to inner_proc() (based on what the above help page seems to say), why does the following not work?

restart;
foo:=proc()
  local a, inner_proc;

  inner_proc := proc()
     local b;
     global a;
     b   := ithprime(10);
     a   := b;                
  end proc;

  inner_proc();

  return(a);
end proc;

Now foo() returns "a". So Maple did not assign 29 to the "a" that is global relative to inner_proc().

This makes little sense to me. Does global in Maple mean the outermost scope only, skipping everything in between?

I thought global meant any variable outside the proc itself, so if a proc() is inside another proc(), then the variables in the outer proc are global to the inner proc, even if they are declared local to the outer proc.

Why did Maple not do the assignment when I explicitly declared "a" to be global in the inner proc?

 

If you type ithprime(10^7) and hit return, the answer 179424673 is returned instantaneously, but if you do ithprime(10^8) there is no answer after an hour or more (and maybe never); it just says “evaluating” and no answer comes. I have done calculations in Maple where the answer was a prime with ~18,000 digits, yet I cannot get beyond ithprime(10^7) when writing code that needs to work up the prime list looking for a particular one. Has anyone else encountered this apparent limitation in Maple?

Is there a way around it? 
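One possible workaround sketch: walk up the primes with nextprime instead of indexing with ithprime (the stopping condition below is only a placeholder):

p := 2:  n := 1:                 # p is the n-th prime
while p < 10^5 do                # placeholder condition: stop at the first prime above 10^5
    p := nextprime(p):
    n := n + 1:
end do:
p, n;                            # the prime found and its index in the prime sequence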

I'm having a hard time trying to understand and use the various fitting functions.

For starters, if I copy and paste the code from the example on the Maple support Help page for NonlinearFit(), I'm not able to successfully execute the code. It gives the error:

Error, (in Statistics:-NonlinearFit) complex value encountered

I've attached my Maple file: testing_nonlinearfit.mw

Hello,

     I'm having trouble simplifying this square root:

assume(p::positive):
sqrt(sqrt(p^2+1)-1)*sqrt(sqrt(p^2+1)+1):
expand(%);

I expect it to give just p, but instead it returns the full, unsimplified expression.

This appears as part of a larger expression, and if it fails to simplify, terms won't cancel and the expression is much longer than it needs to be.
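For what it's worth, one possible approach (a sketch, not verified here) is to combine the product of radicals into a single radical before simplifying:

assume(p::positive):
e := sqrt(sqrt(p^2 + 1) - 1)*sqrt(sqrt(p^2 + 1) + 1):
simplify(combine(e, radical, symbolic));   # may reduce to p under the positivity assumption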

Thanks!

Mukhametshina Liya

Games with pseudo-fractals
 

Homothety_Fractals.mw

 

  

  
