MaplePrimes Posts

MaplePrimes Posts are for sharing your experiences, techniques and opinions about Maple, MapleSim and related products, as well as general interests in math and computing.

Latest Post
  • For years I've been angry that Maple isn't capable of formally manipulating random vectors (aka multivariate random variables).
    For the record, Mathematica can.

    The problem I'm concerned with is to create a vector W such that

    type(W, RandomVariable)

    will return true.
    Of course, defining W from its components w1, ..., wN, where each w is a random variable, is easy, even if these components are correlated or, more generally, dependent (the two concepts being equivalent iff all the w are gaussian random variables).
    But one then loses the key property: W itself is not a (multivariate) random variable.
    See a simple example here: NoRandomVectorsInMaple.mw
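
    For readers who do not open the worksheet, here is a minimal sketch of the same point (my own illustration, assuming two standard normal components):

    restart:
    with(Statistics):
    w1 := RandomVariable(Normal(0, 1)):
    w2 := RandomVariable(Normal(0, 1)):
    W  := Vector([w1, w2]):
    type(w1, RandomVariable);   # true: each component is a random variable
    type(W, RandomVariable);    # false: the Vector itself is not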

    This is why, over the years, I've developed several pieces of code to build a few multivariate random variables (multinormal, Dirichlet, Logistic-Normal, Skew Multivariate Normal, ...).

    In the framework of my activities, they are of great interest and the purpose of this post is to share what I have done on this subject by presenting the most classic example: the multivariate gaussian random variable.

    My leading idea was (and still is) to build a package named MVStatistics, modeled on the Statistics package but devoted to multivariate random variables.
    I have already constructed such a package aggregating about fifty different procedures. But it doesn't really merit the name "Maple package", because I'm not qualified to write something that would be at the same time long-lasting, robust, documented, open, and conflict-free with the Statistics package.
    In case any of you are interested in pursuing this work (I'm about to change jobs), I can provide all the procedures I built to construct and manipulate multivariate random variables.

    To help you understand the principles I used, here is the most iconic example of a multivariate gaussian random variable.
    The attached file contains the following procedures

    MVNormal
      Constructs a gaussian random vector whose components can be mutually correlated
      The statistics defined in the Distribution are (this list could be extended to other
      statistics, provided they are "recognized" statistics; see the end of this post):
          PDF
          Mode
          Mean
          Variance
          StandardDeviation = add(s[k]*x[k], k=1..K)
          RandomSample
    
    DispersionEllipse
      Builds and draws the dispersion ellipses of a bivariate gaussian random vector
    
    DispersionEllipsoid
      Builds and draws the dispersion ellipsoids of a trivariate gaussian random vector
    
    MVstat
      Computes several statistics of a random vector (Mean, Variance, ...)
    
    Iserlis
      Computes the moments of any order of a gaussian random vector
    
    MVCentralMoment
      Computes the central moments of a gaussian random vector
    
    Conditional
      Builds the conditional random vector of a gaussian random vector with respect to some of its components.
      Note: the result has type RandomVariable.
    
    MarginalizeAgainst
      Builds the marginal random vector of a gaussian random vector with respect to some of its components.
      Note: the result has type RandomVariable.
    
    MardiaNormalityTest
      The multi-dimensional analogue of the Shapiro-Wilk normality test
    
    HZNormalityTest
      Henze-Zirkler test for Multivariate Normality
    
    MVWaldWolfowitzTest
      A multivariate version of the non-parametric Wald-Wolfowitz test
    

    Do not hesitate to ask me any questions that might come to mind.
    In particular, as Maple imposes limitations on the type of some attributes (for instance, Mean must be of algebraic type), I've been forced to trick it by encoding vector or matrix quantities as algebraic ones.
    An example is

    Mean = add(m[k]*x[k], k=1..K)

    where m[k] is the expectation of the kth component of this random vector.
    This implies using the procedure MVstat to "decode", for instance, what Mean returns and rewrite it as a vector.
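
    For illustration, here is a hypothetical sketch of this "decoding" step (the names MeanAlg and MeanVec are mine, not the package's):

    K := 3:
    MeanAlg := add(m[k]*x[k], k = 1 .. K):               # Mean stored in algebraic form
    MeanVec := Vector(K, k -> coeff(MeanAlg, x[k]));      # recovers the vector <m[1], m[2], m[3]>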

    MultivariateNormal.mw

    About the statistics the Statistics:-Distribution constructor recognizes:
    To get them, one can do this (the Normal distribution seems to be the continuous one with the most exhaustive list of statistics):

    restart
    with(Statistics):
    X := RandomVariable(Normal(a, b)):
    attributes(X);
          protected, RandomVariable, _ProbabilityDistribution
    
    map(e -> printf("%a\n", e), [exports(attributes(X)[3])]):
    Conditions
    ParentName
    Parameters
    CharacteristicFunction
    CDF
    CGF
    HodgesLehmann
    Mean
    Median
    MGF
    Mode
    PDF
    RousseeuwCrouxSn
    StandardDeviation
    Support
    Variance
    CDFNumeric
    QuantileNumeric
    RandomSample
    RandomSampleSetup
    RandomVariate
    MaximumLikelihoodEstimate
    

    Unfortunately, it happens that for some unknown reason a few statistics cannot be set by the user.
    This is, for instance, the case of Parameters, which can have serious consequences in certain situations.
    Among the other statistics that cannot be set by the user one finds:

    • ParentName,
    • QuantileNumeric, whose role is not very clear, at least to me, but which I suspect is a procedure that "inverts" the CDF to give a numerical estimate of a quantile given its probability.
      If so, access to QuantileNumeric would be of great interest for distributions whose quantiles have no closed-form expressions.
    • CDFNumeric  (same remark as above)

    Finally, the statistic Conditions, which enables defining the conditions the elements of Parameters must verify, is not at all suited to multivariate random variables.
    It is, for instance, impossible to declare that the variance matrix (or the correlation matrix) must be a square symmetric positive-definite matrix.
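
    To make the principle concrete, here is a minimal univariate sketch (a toy example of mine, not part of the attached package) of how user-set recognized statistics enter the Distribution constructor:

    restart:
    with(Statistics):
    MyDist := Distribution(
        PDF = (x -> exp(-x^2/2)/sqrt(2*Pi)),
        Mean = 0,
        Variance = 1
    ):
    Z := RandomVariable(MyDist):
    type(Z, RandomVariable);    # true
    Mean(Z), Variance(Z);       # 0, 1 (the values set above)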

    A new feature has been released on Maple Learn called “collapsible sections”! This feature allows users to hide content within sections on the canvas. You can create a section by highlighting the desired text and clicking this icon in the top toolbar:


    “Well, when can I actually use sections?” you may ask. Let me walk you through two quick scenarios so you can get an idea.


    For our first scenario, let’s say you’re an instructor. You just finished a lesson on the derivatives of trigonometric functions and you’re now going through practice problems. The question itself is not long enough to hide the answers, so you’re wondering how you can cover the two solutions below so that the students can try out the problem themselves first.

    Before, you might have considered hyperlinking a solution document or placing the solution lower down on the page. But now, collapsible sections have come to the rescue! Here’s how the document looks now:

    You can see that the solutions are now hidden, although the section title still indicates which solution it belongs to. Now, you can 1) keep both solutions hidden, 2) show one solution at a time, or 3) show the solutions side by side and compare them!

    Now for the second scenario, imagine you’re making a document which includes a detailed visualization such as in Johnson and Jackson’s proof of the Pythagorean theorem. You want the focus to be on the proof, not the visualization commands that come along with the proof. What do you do?


    It’s an easy solution now that collapsible sections are available!


    Now, you can focus on the proof without being distracted by other information—although the visualization commands can still be accessed by expanding the section again.

    So, take inspiration and use sections to your advantage! We will be doing so as well. You may gradually notice some changes in existing documents in the Maple Learn Gallery as we update them to use collapsible sections.

    Happy document-making!

    A Flow and Maple user wonders why Maple Flow may evaluate to high-precision, floating-point numbers, while the same commands used in Maple evaluate to simple, concise answers.

    We suggest that the same results can be achieved by switching the numeric/symbolic evaluation mode toggle in the Flow math container(s).

    primes-flow-evaluation-modes.flow

    For more information, please see section 3.5 of the Maple Flow User Manual (Numeric and Symbolic Evaluation Modes). 

    A new collection has been released on Maple Learn! The new Pascal’s Triangle Collection allows students of all levels to explore this simple, yet widely applicable array.

    Though the binomial coefficient triangle is often referred to as Pascal’s Triangle after the 17th-century mathematician Blaise Pascal, the first drawings of the triangle are much older. This makes assigning credit for the creation of the triangle to a single mathematician all but impossible.

    Persian mathematicians like Al-Karaji were familiar with the triangular array as early as the 10th century. In the 11th century, Omar Khayyam studied the triangle and popularised its use throughout the Arab world, which is why it is known as “Khayyam’s Triangle” in the region. Meanwhile, in China, the mathematician Jia Xian drew the triangle to 9 rows using rod numerals. Two centuries later, in the 13th century, Yang Hui introduced the triangle to greater Chinese society as “Yang Hui’s Triangle”. In Europe, various mathematicians published representations of the triangle between the 13th and 16th centuries, one of whom was Niccolo Fontana Tartaglia, who propagated the triangle in Italy, where it is known as “Tartaglia’s Triangle”.

    Blaise Pascal had no association with the triangle until years after his death in 1662, when his book, the Treatise on the Arithmetical Triangle, which compiled various results about the triangle, was published. In fact, the triangle was not named after Pascal until several decades later, when it was dubbed so by Pierre Remond de Montmort in 1703.

    The Maple Learn collection provides opportunities for students to discover the construction, properties, and applications of Pascal’s Triangle. Furthermore, students can use the triangle to detect patterns and deduce identities like Pascal’s Rule and The Binomial Symmetry Rule. For example, did you know that colour-coding the even and odd numbers in Pascal’s Triangle reveals an approximation of Sierpinski’s Fractal Triangle?

    See Pascal’s Triangle and Fractals

    Or that taking the sum of the diagonals in Pascal's Triangle produces the Fibonacci Sequence?

    See Pascal’s Triangle and the Fibonacci Sequence
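
    Here is a quick Maple check of that diagonal property (my own snippet, not one of the Learn documents):

    with(combinat):
    diagSum := n -> add(binomial(n - k, k), k = 0 .. floor(n/2)):
    seq(diagSum(n), n = 0 .. 9);      # 1, 1, 2, 3, 5, 8, 13, 21, 34, 55
    seq(fibonacci(n), n = 1 .. 10);   # the same Fibonacci numbers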

    Learn more about these properties and discover others with the Pascal’s Triangle Collection on Maple Learn. Once you are confident in your knowledge of Pascal’s Triangle, test your skills with the interactive Pascal’s Triangle Activity

    On November 11th, Canada and other Commonwealth member states will celebrate Remembrance Day, also known as Armistice Day. This holiday commemorates the armistice signed by Germany and the Entente Powers in Compiègne, France on November 11, 1918, to end the hostilities on the Western Front of World War I. The armistice came into effect at 11:00 am that morning – the “eleventh hour of the eleventh day of the eleventh month”. 

    Similar to how November 11th – which can be written as 11/11 – is a palindromic date that reads the same forward and backward, last year there was “Twosday” – February 22, 2022, also written 22/2/22. 

    Palindromic dates like November 11th that consist only of a day and a month happen every year, but how long will we have to wait until the next “Twosday”? We can use Maple Learn’s new Calendar Calculator to find out!


    To use this document, simply input two dates and press ‘Calculate’ to find the amount of time between them, presented in a variety of units. For example, here are the results for the number of days left until Christmas from November 11th of this year:


    If we return to our original question, which concerns how long we’ll have to wait until the next “Twosday”, we can use this document to find our answer:

    You can use this document as a countdown to find out how much time is left until your favorite holiday, your next birthday, or the time between now and any past or future date; try out the countdown document here!

    We have just released updates to Maple and MapleSim.

    Maple 2023.2 includes a strikethrough character style, a new unit system, improved behavior when editing or deleting subscripts, improved find-and-replace, better mouse selection of piecewise functions and the contents of matrices, and more. We recommend that all Maple 2023 users install this update.

    This update also includes a fix to the problem with setoptions3d, as first reported on MaplePrimes. Thanks, as always, for helping us make Maple better.

    This update is available through Tools>Check for Updates in Maple, and is also available from the Maple 2023.2 download page, where you can find more details.

    At the same time, we have also released an update to MapleSim, which contains a variety of improvements to MapleSim and its add-ons. You can find more information on the MapleSim 2023.2 download page.

    Many everyday decisions are made using the results of coin flips and die rolls, or of similar probabilistic events. Though we would like to assume that a fair coin is being used to decide who takes the trash out or whether our favorite soccer team takes possession of the ball first, it is impossible to tell from a single trial whether the coin is weighted.

    Instead, we can perform an experiment like the one outlined in Hypothesis Testing: Doctored Coin. This is a walkthrough document for testing if a coin is fair, or if it has been doctored to favor a certain outcome. 

    This hypothesis testing document comes from Maple Learn’s new Estimating collection, which contains several documents, authored by Michael Barnett, that help build an understanding of how to estimate the probability of an event occurring, even when the true probability is unknown.

    One of the activities in this collection is the Likelihood Functions - Experiment document, which builds an intuitive understanding of likelihood functions. This document provides sets of observed data from binomial distributions and asks that you guess the probability of success associated with the random variable, giving feedback based on your answer. 

    Once you’ve developed an understanding of likelihood functions, the next step in determining if a coin is biased is the Maximum Likelihood Estimate Example – Coin Flip activity. In this document, you can run as many randomized trials of coin flips as you like and see how the maximum likelihood estimate, or MLE, changes, bearing in mind that if a coin is fair, the probability of either heads or tails should be 0.5. 
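
    As a rough sketch of the same idea in Maple itself (the "doctored" probability 0.6 below is an assumption chosen only for illustration):

    with(Statistics):
    n := 200:
    flips := Sample(Bernoulli(0.6), n):   # 1 = heads, 0 = tails
    MLE := Mean(flips);                   # the sample proportion of heads is the MLE of p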

    Finally, in order to determine in earnest if a coin has been doctored to favor one side over the other, a hypothesis test must be performed. This is a process in which you test any data that you have against the null hypothesis that the coin is fair and determine the p-value of your data, which will help you form your conclusion.

    This Hypothesis Testing: Doctored Coin document is a walkthrough of a hypothesis test for a potentially biased coin. You can run a number of trials on this coin, determine the null and alternative hypotheses of your test, and find the test statistic for your data, all using your understanding of the concepts of likelihood functions and MLEs. The document will then guide you through the process of determining your p-value and what this means for your conclusion.
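
    For instance, here is a minimal sketch of the underlying computation (62 heads in 100 flips is made-up data, and the Learn document's exact procedure may differ):

    with(Statistics):
    n := 100:  k := 62:                        # observed data (assumed for this example)
    X := RandomVariable(Binomial(n, 0.5)):     # null hypothesis: the coin is fair
    pValue := evalf(2*Probability(X >= k));    # two-sided p-value, roughly 0.02 here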

    So if you’re having suspicions that a coin is biased or that a die is weighted, check out Maple Learn’s Estimating collection and its activities to help with your investigation!

    The Maple Conference starts tomorrow Oct. 26 at 9am EDT! It's not too late to register: https://www.maplesoft.com/mapleconference/2023/. Even if you can't attend all the presentations, registration will allow you to view the recorded videos after the conference. 

    Check out the detailed conference program here: https://www.maplesoft.com/mapleconference/2023/full-program.aspx

    This is a successful attempt to simulate space frames in MapleSim using the relatively new Rod component.

    At t=3s, a lateral force component is applied to make the simulation more interesting.

    The structure collapses/folds in an origami style fashion.

    To build the model, MapleSim needs additional components.

    For example, an equilateral triangle requires the addition of rigid body frames at the connection of two rods.

    Additionally, the rigid body frames must have initial position conditions (ICs) that match the intended structure.

    Interestingly, the ICs do not have to be set to Treat as Guess. It is only required to provide approximate coordinates. Leaving the ICs on Ignore was sufficient for the attached model.

    Rod components can be replaced by Flexible Beam components which require considerably more simulation time and either Revolute joints at their ends or a rather complex connection with Rigid Body frames (of zero length) to adjacent Flexible Beam components.

    Spaceframe_2.msim

    With Halloween right around the corner, we at Maplesoft wanted to celebrate the occasion with an activity where you can carve your own pumpkin… using math! 

    Halloween is said to have originated a few hundred years back in ancient Celtic festivals, specifically one called Samhain. This was celebrated from October 31st to November 1st to mark the end of harvesting season and the beginning of winter, or the "darker quarter" of the year. Since then, Halloween has evolved into a fun celebration of candy and costumes in many countries!

    With that said, here’s my take on the pumpkin carving activity: 

    The great thing is, if you mess up, you can always go back, unlike carving pumpkins in real life. My design is pretty simple (although cute), so let’s see what you all can impress us with!

    You can also make your own original art and publish it to your channel so that anyone can see your own artistic creations. You can also attend the Maple Conference next week on October 26 and 27, an event filled with two days of presentations from members of the Maplesoft Community. Participants will also be able to see all the artwork submitted for the Art Gallery and Creative Showcase, where you can draw inspiration for your own submissions to next year’s showcase! The conference is virtual and free of charge, and you can register here.

    Looking forward to seeing you there!

    The Maple Conference will be starting in two weeks! The detailed agenda, which includes abstracts of invited and contributed talks, is available here: https://www.maplesoft.com/mapleconference/2023/full-program.aspx.

    Please join us on October 26 and 27 for two days of presentations from our staff members and the larger Maple community, a look at our Art Gallery and Creative Showcase, opportunities for networking with other Maple enthusiasts, and more! The conference is virtual and free of charge, and you can register at https://www.maplesoft.com/mapleconference/2023/.

    We look forward to seeing you at the conference!

    Almost 300 years ago, a single letter exchanged between two brilliant minds gave rise to one of the most enduring mysteries in the world of number theory. 

    In 1742, Christian Goldbach penned a letter to fellow mathematician Leonhard Euler proposing that every even integer greater than 2 can be written as a sum of two prime numbers. This statement is now known as Goldbach’s Conjecture (it is considered a conjecture, and not a theorem because it is unproven). While neither of these esteemed mathematicians could furnish a formal proof, they shared a conviction that this conjecture held the promise of being a "completely certain theorem." The following image demonstrates how prime numbers add to all even numbers up to 50:

    From its inception, Goldbach's Conjecture has enticed generations of mathematicians to seek evidence of its legitimacy. Though weaker versions of the conjecture have been proved, the definitive proof of the original conjecture has remained elusive. There was even once a one-million dollar cash prize set to be awarded to anyone who could provide a valid proof, though the offer has now elapsed. While a heuristic argument, which relies on the probability distribution of prime numbers, offers insight into the conjecture's likelihood of validity, it falls short of providing an ironclad guarantee of its truth.

    The advent of modern computing has emerged as a beacon of progress. With vast computational power at their disposal, contemporary mathematicians like Dr. Tomàs Oliveira e Silva have achieved a remarkable feat—verification of the conjecture for every even number up to an astonishing 4 quintillion, a number with 18 zeroes.

    Lazar Paroski’s Goldbach Conjecture Document on Maple Learn offers an avenue for users of all skill levels to delve into one of the oldest open problems in the world of math. By simply opening this document and inputting an even number, a Maple algorithm will swiftly reveal Goldbach’s partition (the pair of primes that add to your number), or if you’re lucky it could reveal that you have found a number that disproves the conjecture once and for all.
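
    If you would rather experiment in Maple itself, here is a minimal brute-force sketch (my own, not Lazar Paroski’s algorithm) that returns a Goldbach partition:

    GoldbachPartition := proc(n::even)
        local p;
        p := 2;
        while p <= n/2 do
            if isprime(p) and isprime(n - p) then
                return [p, n - p];
            end if;
            p := nextprime(p);
        end do;
        return FAIL;   # reaching this line would disprove the conjecture for n
    end proc:
    GoldbachPartition(48);   # [5, 43]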

    A salesperson wishes to visit every city on a map and return to a starting point. They want to find a route that will let them do this with the shortest travel distance possible. How can they efficiently find such a route given any random map?

    Well, if you can answer this, the Clay Mathematics Institute will give you a million dollars. It’s not as easy a task as it sounds.

    The problem summarized above is called the Traveling Salesman Problem, one of a category of mathematical problems called NP-complete. No known efficient algorithm to solve NP-complete problems exists. Finding a polynomial-time algorithm, or proving that one could not possibly exist, is a famous unsolved mathematical problem.

    Over years of research, many advances have been made in algorithms that can solve the problem, not in perfectly efficient time, but quickly enough on many smaller examples that you can hardly notice the delay. One of the most significant Traveling Salesman Problem solvers is the Concorde TSP Solver. This program can find optimal routes for maps with thousands of cities.

    Traveling Salesman Problems can also be used outside of the context of visiting cities on a map. They have been used to generate gene mappings, microchip layouts, and more.

    The power of the legendary Concorde TSP Solver is available in Maple. The TravelingSalesman command in the GraphTheory package can find the optimal solution for a given graph. The procedure offers a choice of the recently added Concorde solver or the original pure-Maple solver.

    To provide a full introduction to the Traveling Salesman Problem, we have created an exploratory document in Maple Learn! Try your hand at solving small Traveling Salesman examples and comparing different paths. Can you solve the problems as well as the algorithm can?
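
    As a small taste of the command itself, here is a toy example (the four "cities" and their distances are made up for illustration):

    with(GraphTheory):
    G := Graph({[{"A", "B"}, 3], [{"A", "C"}, 5], [{"A", "D"}, 4],
                [{"B", "C"}, 2], [{"B", "D"}, 6], [{"C", "D"}, 7]}):
    TravelingSalesman(G);   # returns the weight of an optimal tour and the tour itself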

    # ----------------------------------THE DESIGN OF THE MAPLET SCREEN---------------------
    with(Maplets[Elements]):
    HCC:=Maplet(Window('title'="HEAT CONDUCTIVITY CONTROL",["WITH THIS APPLICATION THE CONDUCTIVITY COEFFICIENT OF A ONE-DIMENSIONAL OBJECT, APPROXIMATING THE TEMPERATURE OF THE OBJECT TO A TARGET TEMPERATURE AT A CERTAIN FINAL TIME, IS CONTROLLED. ",[["l",TextField[l](3)],["T",TextField[T](3)],["f(x,t)",TextField[f](15)],["phi(x)",TextField[ph](5)]],[["k(0)",TextField[k0](3)],["g0(t)",TextField[g0](10)],["k(l)",TextField[kl](3)],["g1(t)",TextField[g1](10)],["mu(x)",TextField[mu](10)]],[["alpha",TextField[alpha](3)],["kaplus(x)",TextField[kaplus](5)],["N",TextField[N](3)],["kstart(x)",TextField[kstart](3)],["beta",TextField[beta](3)],["eps",TextField[eps](3)]] ,[Button("Calculate the Control",Evaluate('kutu'=ms(N,l,alpha,T,ph,f,g0,g1,mu,kaplus,k0,kl,kstart,beta,eps))),[TextBox['kutu'](30..30)],Button("Draw the Control",Evaluate('Draw'='plot(kutu,x=0..l)')),Plotter['Draw'](),[[Button("Distance to Target",Evaluate('kutu2'=ms8(N,l,alpha,T,ph,f,g0,g1,mu,kaplus,k0,kl,kstart,beta,eps))),TextField['kutu2'](12)],[Button("Approximation to kaplus",Evaluate('kutu3'='evalf(int((kutu-kaplus)^2,x=0..l))')),TextField['kutu3'](12)]]],Button("Shutdown",Shutdown())])):
    # -------------------------PROCEDURE FOR CALCULATION OF THE CONTROL FUNCTION-----------
    with(inttrans):
    with(linalg):
    ms:=proc(N,l,alpha,T,ph,f,g0,g1,mu,kaplus,k0,kl,kstart,beta,eps):
    with(inttrans):
    with(linalg):
    w:=simplify(x^2/2*g1/(l*kl)+(x^2/2-x*l)*g0/(l*k0)):
    phdal:=ph-subs(t=0,w):
    fdal:=simplify(f-diff(w,t)+diff(kaplus*diff(w,x),x)):
    # ---------------------------------Solution of the Heat Problem------------------------------------
    dp:=proc(ka)
    with(inttrans):
    with(linalg):
    phi:=Vector(1..N):
    phi[1]:=1/sqrt(l):
    for i from 2 to N do
    phi[i]:=evalf(sqrt(2/l)*cos((i-1)*Pi*x/l)):
    od:
    K:=Array(1..N,1..N):
    for j from 1 to N do
    for k from 1 to N do
    K[j,k]:=evalf(-int(ka*diff(phi[k],x$2)*phi[j],x=0..l)):
    od:
    od:
    F:=Vector(1..N):
    for n from 1 to N do
    F[n]:=evalf(int(fdal*phi[n],x=0..l)):
    od:
    A:=Vector(1..N):
    for m from 1 to N do
    A[m]:=evalf(int(phdal*phi[m],x=0..l)):
    od:
    KL:=Matrix(1..N,1..N):
    for j1 from 1 to N do
    for k1 from 1 to N do
    if (j1=k1) then KL[j1,k1]:=s+K[j1,k1] else KL[j1,k1]:=K[j1,k1] fi:
    od:
    od:
    FL:=Vector(1..N):
    for i1 from 1 to N do
    FL[i1]:=evalf(laplace(F[i1],t,s));
    od:
    S:=Vector(1..N):
    for i2 from 1 to N do
    S[i2]:=(A[i2]+FL[i2]);
    od:
    C:=Vector(1..N):
    C:=evalm(inverse(KL)&*S):
    c:=Vector(1..N):
    for i3 from 1 to N do
    c[i3]:=evalf(invlaplace(C[i3],s,t)):
    od:
    v:=evalf(add(c[n1]*phi[n1],n1=1..N)):
    uyak:=v+w;
    end:
    # ---------------------------------Solution of the Adjoint Problem------------------------------------
    ap:=proc(ka)
    with(inttrans):
    with(linalg):
    utau:=evalf(subs(t=T-tau,dp(ka))):
    phe:=evalf(2*(subs(tau=0,utau)-mu));
    phie:=Vector(1..N):
    phie[1]:=1/sqrt(l):
    for i4 from 2 to N do
    phie[i4]:=evalf(sqrt(2/l)*cos((i4-1)*Pi*x/l)):
    od:
    Kc:=Array(1..N,1..N):
    for j2 from 1 to N do
    for k2 from 1 to N do
    Kc[j2,k2]:=evalf(-int(ka*diff(phie[k2],x$2)*phie[j2],x=0..l)):
    od:
    od:
    Fc:=Vector(1..N):
    for m1 from 1 to N do
    Fc[m1]:=0:
    od:
    Ac:=Vector(1..N):
    for cm1 from 1 to N do
    Ac[cm1]:=evalf(int(phe*phie[cm1],x=0..l)):
    od:
    KLC:=Matrix(1..N,1..N):
    for cj1 from 1 to N do
    for ck1 from 1 to N do
    if (cj1=ck1) then KLC[cj1,ck1]:=s+Kc[cj1,ck1] else KLC[cj1,ck1]:=Kc[cj1,ck1] fi:
    od:
    od:
    FLC:=Vector(1..N):
    for ci1 from 1 to N do
    FLC[ci1]:=evalf(laplace(Fc[ci1],tau,s));
    od:
    Sc:=Vector(1..N):
    for ci2 from 1 to N do
    Sc[ci2]:=(Ac[ci2]+FLC[ci2]);
    od:
    CC:=Vector(1..N):
    CC:=evalm(inverse(KLC)&*Sc):
    cc:=Vector(1..N):
    for ci3 from 1 to N do
    cc[ci3]:=evalf((invlaplace(CC[ci3],s,tau))):
    od:
    ve:=evalf(add(cc[cn]*phie[cn],cn=1..N)):
    eta:=evalf(subs(tau=T-t,ve));
    end:
    # ---------------------------------Calculation of the Gradient----------------------------------
    Türev:=proc(alpha,ka)
    Türe:=simplify(evalf(-int(diff(dp(ka),x)*diff(ap(ka),x),t=0..T)+2*alpha*(ka-kaplus)));
    end:
    # ----------------------------Calculation of the Cost Functional--------------------------------
    Jka:=proc(ka)
    IJ1:=evalf(int((subs(t=T,dp(ka))-mu)^2,x=0..l));
    end:
    Sta:=proc(ka)
    IJ2:=simplify(evalf((int((ka-kaplus)^2,x=0..l))));
    end:
    II:=proc(ka)
    IJ:=simplify(evalf(Jka(ka)+alpha*Sta(ka))):
    end:# 
    # -----------------------------------Minimizing Process--------------------------------------------
    a[0]:=kstart:
    ka[0]:=kstart:
    sayı:=0:
    for im from 0 to 60 do
    a[im+1]:=simplify(evalf(ka[im-sayı]-beta*Türev(alpha,ka[im-sayı]))):
    fark:=evalf(II(ka[im-sayı])-II(a[im+1])):
    if(fark>0 and fark<eps) then break elif (fark>0) then
    j:=im+1: ka[j-sayı]:=a[im+1]:   elif(fark<=0) then  sayı:=sayı+1: beta:=beta/(1.2): ka[im-sayı+2]:=ka[im-sayı+1]:   else fi:
    od:
    optcont:=a[im+1]:
    end:
    # -------------------------END OF THE PROCEDURE FOR CALCULATION OF THE CONTROL FUNCTION-----------
    # ------PROCEDURE FOR CALCULATION OF THE DISTANCE TO THE TARGET FUNCTION-----------
    ms8:=proc(N,l,alpha,T,ph,f,g0,g1,mu,kaplus,k0,kl,kstart,beta,eps):
    with(inttrans):
    with(linalg):
    w8:=simplify(x^2/2*g1/(l*kl)+(x^2/2-x*l)*g0/(l*k0)):
    phdal8:=ph-subs(t=0,w8):
    fdal8:=simplify(f-diff(w8,t)+diff(kaplus*diff(w8,x),x)):
    phi8:=Vector(1..N):
    phi8[1]:=1/sqrt(l):
    for i8 from 2 to N do
    phi8[i8]:=evalf(sqrt(2/l)*cos((i8-1)*Pi*x/l)):
    od:
    K8:=Array(1..N,1..N):
    for j8 from 1 to N do
    for k8 from 1 to N do
    K8[j8,k8]:=evalf(-int(ms(N,l,alpha,T,ph,f,g0,g1,mu,kaplus,k0,kl,kstart,beta,eps)*diff(phi8[k8],x$2)*phi8[j8],x=0..l)):
    od:
    od:
    F8:=Vector(1..N):
    for m28 from 1 to N do
    F8[m28]:=evalf(int(fdal8*phi8[m28],x=0..l)):
    od:
    A8:=Vector(1..N):
    for m8 from 1 to N do
    A8[m8]:=evalf(int(phdal8*phi8[m8],x=0..l)):
    od:
    KL8:=Matrix(1..N,1..N):
    for j18 from 1 to N do
    for k18 from 1 to N do
    if (j18=k18) then KL8[j18,k18]:=s+K8[j18,k18] else KL8[j18,k18]:=K8[j18,k18] fi:
    od:
    od:
    FL8:=Vector(1..N):
    for i148 from 1 to N do
    FL8[i148]:=evalf(laplace(F8[i148],t,s));
    od:
    S8:=Vector(1..N):
    for i48 from 1 to N do
    S8[i48]:=(A8[i48]+FL8[i48]);
    od:
    C8:=Vector(1..N):
    C8:=evalm(inverse(KL8)&*S8):
    c8:=Vector(1..N):
    for i58 from 1 to N do
    c8[i58]:=evalf(invlaplace(C8[i58],s,t)):
    od:
    v8:=evalf(add(c8[n8]*phi8[n8],n8=1..N)):
    uyak8:=v8+w8;
    IJ18:=evalf(int((subs(t=T,uyak8)-mu)^2,x=0..l));
    end:
    # ------END OF THE PROCEDURE FOR CALCULATION OF THE DISTANCE TO THE TARGET FUNCTION-----------
    Maplets[Display](HCC):

    Deleted posts should go into a separate container on MaplePrimes for review by the original poster.

    In the past, some have been deleted by accident, others for good reasons, and others just because.

    The idea of putting them into a container is so that accidental deletions can be recovered rather than lost. A legitimate deletion is one where the post provides no value to the original question.
