mmcdara

4019 Reputation

17 Badges

6 years, 177 days

MaplePrimes Activity


These are replies submitted by mmcdara

@Rouben Rostamian  

Hi, in my reply to @Oliveira I assumed, for illustrative purposes, that u(x, 0) = 0.
I'm curious whether Maple is capable of solving this problem.

If I'm not mistaken, pdsolve doesn't accept non-rectangular domains.
In this case a solution is to solve a diffusion equation over Omega = (0, 2) x (0, 1)

PDE := diff(lambda(x, y) * diff(u(x, y), x), x) + diff(lambda(x, y) * diff(u(x, y), y), y) = 0

where the diffusion coefficient lambda(x, y) takes the value 1 within your triangular domain T and a very large value L over Omega \ T.
Given the following boundary conditions

BC := {
        u(x, 0) = 0, 
        u(0, y) = 20, 
        u(2, y) = 100, 
        u(x, 1) = piecewise(x <=1, 20, 100)
      }

and provided L >> 1, u(x, y) should satisfy

  1. if x < 1 : u(x, x) ~ 20
  2. if x > 1 : u(x, 2-x) ~ 100

This is a classical trick for solving a PDE on a non-rectangular domain.
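In Maple notation, the penalized set-up could be sketched like this. This is a hedged sketch of my own: I assume the triangle T has vertices (0, 0), (2, 0), (1, 1), consistent with the two checks above, and the value 10^6 for L is purely illustrative.

```
# Assumption: T is the triangle with vertices (0,0), (2,0), (1,1)
L      := 10^6:                                             # "very large" penalization value
lambda := (x, y) -> piecewise(y < x and y < 2 - x, 1, L):   # 1 inside T, L over Omega \ T
PDE    := diff(lambda(x, y)*diff(u(x, y), x), x)
        + diff(lambda(x, y)*diff(u(x, y), y), y) = 0:
BC     := { u(x, 0) = 0, u(0, y) = 20, u(2, y) = 100,
            u(x, 1) = piecewise(x <= 1, 20, 100) }:
```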

In your case it seems that pdsolve cannot return a formal solution (but I'm not familiar with pdsolve) and that it cannot solve elliptic PDEs numerically.

Hint.mw

I hope you will receive a more enlightened opinion.

Thanks @acer 

About the use of UseHardwareFloats := false ... it was a remnant of some older code.
Explanation
My first code, based on @wzelik's pseudo-code, used a Uniform(-h, h) proposal.
I think I observed that rand(-h .. +h) is faster than Sample(Uniform(-h, +h), 1)[1].
So my code was (only the relevant lines shown):

... 
local beta_effective_range := Probability(B > 0.99, numeric) - Probability(B < 0.01, numeric);
local proposal             := rand(-beta_effective_range/shrink .. beta_effective_range/shrink):
local ARtest               := rand(0. .. 1.):
...

When I ran it I got this error message
Error, (in RandomTools:-Generate) invalid input: unknown expects value for keyword parameter range to be of type integer .. integer, but received -190272131778741471/50000000 .. 190272131778741471/50000000
which comes from the line 

Sample(B, 1)[1] + proposal()

because Sample(B, 1)[1] is an hfloat while proposal() is not.

I then set UseHardwareFloats:=false to remove this error.
Another solution could have been to write 

local proposal := RandomVariable(Uniform(-beta_effective_range/shrink, beta_effective_range/shrink)):

When I decided to use a Gaussian proposal

local proposal := RandomVariable(Normal(0, beta_effective_range/shrink)):

then both Sample(B, 1)[1] and proposal() were hfloats ... but I forgot to remove the declaration 

UseHardwareFloats:=false

My mistake.

About the way you construct helper:
I think I understand a little better what you did ... but I'm not sure I could do it again on my own.

@wzelik 

I told you there were wizards here

@acer 

I was hoping that you or Carl or someone else would come up with something much more clever than what I had done.
Well done.

Some points:

  • I hadn't noticed that ARtest could be instantiated this way (even though it's clearly stated in the related help page):
    Sample(Uniform(0,1), ARtest)
  • Could you explain why you wrote this?
    helper := subs(__d1=beta_pdf(g),  
                 proc(..) 
                 ... 
                 L := binomial(N, K)*g^K*(1 - g)^(N - K) * __d1;
                 ...
                 end proc 
              )
  • Is the up-front computation of ARtest and proposal the main reason for the performance gain?

TIA

@wzelik 
It wasn't my intention to play the teacher; I simply didn't know what your knowledge of MCMC and Bayesian inference was.
I must apologize for not having paid much attention to your Maple code (for next time, remember to use the big green up arrow to upload your worksheet); I focused mainly on your R-like pseudo-code instead. This explains why I said in my last reply that I used a Gaussian proposal instead of a Uniform one (the Uniform proposal comes from your R-like pseudo-code, whereas your Maple code uses a Gaussian proposal too).

In my worksheet I did not use Quantile, which is often a "slow" procedure.
A simpler and more efficient way to do the operation 

Quantile(Normal(a, s), evalf(rand()/10^12), numeric=true); 

is to Sample the Normal(a, s) distribution (even if, as you draw only one sample at a time, the differences are not that spectacular):

Sample(Normal(a, s), 1)[1]

Look at the attached file perf.mw
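A minimal self-contained version of the comparison made in perf.mw might look like this (an illustrative sketch of mine; the distribution and the number of draws are arbitrary):

```
with(Statistics):
X := RandomVariable(Normal(0, 1)):
# one-at-a-time quantile inversion versus one-at-a-time sampling
CodeTools:-Usage(seq(Quantile(X, evalf(rand()/10^12), numeric=true), k = 1 .. 10^3)):
CodeTools:-Usage(seq(Sample(X, 1)[1], k = 1 .. 10^3)):
# and, much faster still, a single vectorized call
CodeTools:-Usage(Sample(X, 10^3)):
```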

A common practice to avoid building one long Markov chain is to run several Markov chains in parallel with different starting points. Running P chains on P processors simply divides the whole time by P, provided P doesn't exceed the number of processors of your PC.
Have a look at the Grid package if you are interested.
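A hedged sketch of what this could look like with Grid, assuming the mcmc procedure defined elsewhere in this thread (the arguments and the use of all available nodes are illustrative choices of mine):

```
with(Grid):
Grid:-Set(mcmc):      # make the mcmc procedure available on every node
# run one independent chain per node, then pool the post-burn-in draws
Chains := [Grid:-Seq(mcmc(4, 2, 21, 1, 5000, 5), k = 1 .. Grid:-NumNodes())]:
```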

I usually use R and, as a general rule, R is much faster than Maple (and it has other advantages as well).
Without advanced programming in Maple (and I am not able to do that, unlike others here) it is very unlikely that you will be able to achieve the same performance as with R.

perf.mw

Last point: I often use the ESS criterion (Effective Sample Size) in Maple, for it is easy to implement.
In case you don't know it, see for instance
https://mc-stan.org/docs/2_18/reference-manual/effective-sample-size-section.html
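For reference, the simplest truncated-sum version of the ESS (the kind of thing I mean by "easy to implement") could be sketched as follows. The cutoff rule, stopping at the first negative autocorrelation, is one common convention among several; treat this as a sketch, not the Stan estimator from the link above.

```
# ESS ~ N / (1 + 2*sum(rho_t)), summing lag-t autocorrelations
# until the first negative one (a simple, common truncation rule)
ESS := proc(chain::Vector, maxlag::posint)
  local N, t, rho, s;
  N := numelems(chain);
  s := 0;
  for t to maxlag do
    rho := Statistics:-Correlation(chain[1 .. N-t], chain[1+t .. N]);
    if rho < 0 then break end if;
    s := s + rho
  end do;
  return N / (1 + 2*s)
end proc:
```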

@wzelik

Here is an MCMC builder.
Improvements are likely.
I use a Gaussian proposal, which has better properties than a Uniform one.

restart:

with(Statistics):

mcmc := proc(N, K, a, b, Length, shrink)
  local like                 := p -> binomial(N, K)*p^K*(1 - p)^(N - K);
  local B                    := RandomVariable('Beta'(a, b)):
  local beta_pdf             := unapply(PDF(B, x), x):
  local beta_effective_range := Probability(B > 0.99, numeric) - Probability(B < 0.01, numeric);
  local proposal             := RandomVariable(Normal(0, beta_effective_range/shrink)):
  local ARtest               := rand(0. .. 1.):
  local chain                := Matrix(Length, 2):

  local n, is_out, guess, L, R, t:

  chain[1, 1]          := Sample(B, 1)[1]:
  chain[1, 2]          := like(chain[1, 1]) * beta_pdf(chain[1, 1]):
  for n from 2 to Length do
    is_out := true:
    while is_out do
      guess  := chain[n-1, 1] + Sample(proposal, 1)[1]:
      is_out := `or`(guess < 0, guess > 1)
    end do:
    L := like(guess) * beta_pdf(guess):
    R := L / chain[n-1, 2]:
    t := ARtest():
    if R > t then
      chain[n, 1] := guess:
      chain[n, 2] := L:
    else
      chain[n, 1] := chain[n-1, 1]:
      chain[n, 2] := chain[n-1, 2]:
    end if:
  end do:
  return chain
end proc:

UseHardwareFloats:=false:

L    := 5000:
Burn := 1000:
MC   := CodeTools:-Usage( mcmc(4, 2, 21, 1, L, 5) ):

plots:-display(
  Histogram(MC[.., 1]),
  plot(PDF('Beta'(21+2, 1+4-2), z), z=(min..max)(MC[Burn..L, 1]), color=red, thickness=2)
);

sort(MC[Burn..L, 1])[round~([(L-Burn)*0.025, (L-Burn)*0.975])];
ScatterPlot(`<,>`($1..L), MC[.., 1], symbol=point);

memory used=94.44MiB, alloc change=0 bytes, cpu time=1.15s, real time=1.39s, gc time=0ns

 

 

Vector[column]([[.7267474603], [.9699243929]])

 

 

UseHardwareFloats:=false:

L    := 10000:
Burn := 1000:
MC   := CodeTools:-Usage( mcmc(4, 2, 21, 1, L, 5) ):

plots:-display(
  Histogram(MC[.., 1]),
  plot(PDF('Beta'(21+2, 1+4-2), z), z=(min..max)(MC[Burn..L, 1]), color=red, thickness=2)
);

sort(MC[Burn..L, 1])[round~([(L-Burn)*0.025, (L-Burn)*0.975])];
ScatterPlot(`<,>`($1..L), MC[.., 1], symbol=point);

memory used=186.49MiB, alloc change=0 bytes, cpu time=2.36s, real time=2.40s, gc time=0ns

 

 

Vector[column]([[.7445968635], [.9716363934]])

 

 

 

Download MCMC.mw

Your equation eq1[i, j] seems to be the finite-difference discretization of this PDE

'lambda'*diff(u(x, y), y) = (Gr/2)*diff(t(x, y), y) + (Gr/2)*diff(c(x, y), y) + ('lambda'^2/2)*diff(u(x, y), [x, x, y]) - M*u(x, y)

(with a mass lumping discretization of u(x,y))
I didn't verify this but I believe it's the same thing for the other equations.

Maple can solve PDE systems (see the help page for pdsolve).
So, unless you truly want to code your own finite-difference scheme (it is generally far from trivial to obtain a stable and convergent scheme, I mean as h and k both tend to 0), why wouldn't you start with pdsolve to see if it finds a solution?

If pdsolve fails, it will always be soon enough to write your own discretization scheme.

@dharr 

Nice, I vote up

@zenterix 

Did you read my answer and look at the attached file (specifically the last example)?

The error displayed in your worksheet OPevents2.mw cannot come from this worksheet, because its dsolve command contains no events!
It's likely that you removed events=[...] from the dsolve command after you got the error message.

Here is the proof and the way to implement as an event the condition 

[[0, 2 - x<0], [i(x) = i(x) + 1]]

# obviously equivalent to 

[[0, 2 - x = 0], [i(x) = i(x) + 1]]

# as x is monotonically increasing


 

restart

interface(version)

`Standard Worksheet Interface, Maple 2015.2, Mac OS X, December 21 2015 Build ID 1097895`

(1)

with(Statistics):

 


 
gen := proc (t) local hrand, start; setState(t); hrand := Sample(('Uniform')(1, 2)); return hrand(1)[1] end proc:

sys := diff(y(x), x) = z(x)*'gen(x)', diff(z(x), x) = y(x)*'gen(x)':

fcns := {y(x), z(x)}:

p := dsolve({sys, y(0) = 0, z(0) = 1}, fcns, type = numeric, method = rkf45, abserr = 0.1e-2, relerr = 0.1e-2):
plots:-odeplot(p, [[x, y(x)], [x, z(x)]], -4 .. 4, symbol = box)

 

p(3);

[x = 3., y(x) = HFloat(31.39034111150285), z(x) = HFloat(40.91343696274095)]

(2)

p := dsolve(
       {sys} union {diff(w(x), x)=1, w(0)=0, y(0) = 0, z(0) = 1, i(0)=1}
       , fcns union {w(x)}
       , type = numeric
       , method = rkf45
       , abserr = 1e-6
       , relerr = 1e-6
       , discrete_variables = [(i(x))::float]
       , events = [ [w(x)=2, i(x) = i(x)+1] ]
     ):
plots:-display(
  plots:-odeplot(p, [x, y(x)], -4 .. 4, color=blue, legend=typeset('y(x)')),
  plots:-odeplot(p, [x, z(x)], -4 .. 4, color=red , legend=typeset('z(x)'))
);
plots:-odeplot(p, [x, i(x)], -4 .. 4, color=blue, legend=typeset('i(x)'));

 

 


Download OPevents2_NoError_And_Event.mw

PS : change the values of abserr and relerr to see what happens

(Hope I do not interfere with @C_R)

Concerning the question about the plot not touching the x axis: this is just a matter of drawing precision.

There is an option named numpoints which, according to the help pages:
"Specifies the minimum number of points to be generated. The default is 200. Note: plot employs an adaptive plotting scheme which automatically does more work where the function values do not lie close to a straight line. Hence, plot often generates more than the minimum number of points."
Setting it to a higher value may improve the rendering.

Nevertheless, I believe that in this case it's far better to plot the whole function piecewise:

  1. Isolate the zeros.
  2. Draw from one zero to the next one
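The two steps above could be sketched like this (a hedged sketch of mine: the function f and the range a .. b are placeholders, not the OP's actual function):

```
f := x -> sin(x)*exp(-x/5):      # placeholder function
a, b := 0, 20:                   # placeholder plotting range
# step 1: isolate the zeros, walking through them with RootFinding:-NextZero
zeros := []:
z := RootFinding:-NextZero(f, a):
while z < b do
  zeros := [zeros[], z];
  z := RootFinding:-NextZero(f, z)
end do:
# step 2: draw from one zero to the next one
knots := [a, zeros[], b]:
plots:-display(seq(plot(f, knots[i] .. knots[i+1]), i = 1 .. numelems(knots) - 1));
```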

Look at the attached file
Refined_Plots.mw

(Worksheet done with Maple 2015; newer versions may have more advanced plot capabilities, look at the related help pages.)

@Oliveira 

plottools:-getdata(...)[3] is the general way to get the matrix of the plotted points.
But what looks like a "single curve" is sometimes a collection of several curves.
In your case intersectplot returns a set of curves, while plot/plot3d returned only one curve.

Then you must apply what @acer did for each piece of curve:

L := [ plottools:-getdata(temp1) ]:
MyPoints := convert~(map2(op, 3, L), listlist);

Given the structure intersectplot builds (see below), a simpler way is

MyPoints := [op(op(temp1)[1])]

I believe a good practice before using plottools:-getdata is to "visualize" the structure of the plot:

# compare this
op(de);
# to this
op(temp1);

I expect @acer will give more detailed explanations about that.

@acer

Thanks again for this new material.

"Fwiw, your original ... you don't like."
It seems that you have come full circle and that you have figured me out.

About Maple 2022: it should be my Christmas gift (at the office, of course).

@acer 

Do I really need the symbolic integration?

Clearly, if someone asked me what purpose it could serve, I would be hard pressed to answer.

One of my main domains of activity is reliability analysis: in a few words, assessing extreme quantiles of a function (often a "computational code") of several random variables. Generally this involves either extremely expensive Monte-Carlo simulations or very specific simulation methods.

So I was asked a few days ago about estimating the probability that a product of independent random variables exceeds some given value. As the problem was simple, I naturally proposed to use a crude Monte-Carlo approach.
But, for an unknown reason (I would say disconnected from pragmatic considerations, if not irrational), my clients wanted to know the analytical expression of the pdf of the product variable (I guess it looks better in an official report if you say that the results come from an exact computation done with some CAS, rather than from naive Monte-Carlo).
In any case, I'm asked to work on the search for a formal solution to the problem (note that having the closed form of the pdf doesn't mean that the desired probability will be computable analytically, for what we need is to invert the cdf, which is another problem!).
I said it felt like a waste of time, but...

So, Do I really need the symbolic integration? NO, crude Monte-Carlo is enough.
Let's say the problem I submitted was more a kind of amusement someone asked me (ordered me?) to work on. And I got into the game like a good soldier.
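For the record, the basic computation behind such a closed form is the product-density formula f_Z(z) = int(f_X(x)*f_Y(z/x)/|x|, x). A minimal Maple sketch with two independent Uniform(0, 1) variables (a toy example of mine, not the client's actual problem):

```
# For Z = X*Y with X, Y ~ Uniform(0, 1) independent, the supports
# 0 < x < 1 and 0 < z/x < 1 reduce the integral to x = z .. 1
fZ := int(1/x, x = z .. 1) assuming 0 < z, z < 1;   # -ln(z)
# sanity check: the density integrates to 1 over (0, 1)
int(-ln(z), z = 0 .. 1);                            # 1
```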

--------------------------------------------------------------

In any case, I greatly appreciated the work you just delivered, even if, as you will have understood, its interest for the real problem remains marginal.

Probably having a closed form of the pdf is very cool. But its expression is not concise enough to be of practical interest: for me it remains the kind of thing you put in an appendix to show off. So you were right to ask me about the need for symbolic integration, but still far from the appalling reality :-)

Very cool!
