MaplePrimes Posts

MaplePrimes Posts are for sharing your experiences, techniques and opinions about Maple, MapleSim and related products, as well as general interests in math and computing.

Latest Posts
  • Pull up the content when writing at the end of the page.

     

    When writing at the end of the page, the text I am typing sits at the very bottom of the screen, so I always press Enter a few times to add blank space below it and get a better view of what I am writing.

     

    Does anyone agree with this suggestion?

    Or is there already a built-in way to achieve this that I may have overlooked?

    In my previous posts I have discussed various difficulties encountered when writing parallel algorithms. At the end of the last post I concluded that the best way to solve some of these problems is to introduce a higher level programming model. This blog post will discuss the Task Programming Model, the high level parallel programming model introduced in Maple 13.
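    As a small sketch of my own (not code from this post), here is what the Task Programming Model looks like in practice, using the Threads:-Task commands to sum a range of integers by recursively splitting it into child tasks:

    # Sum the integers from lo to hi by splitting the range into two child
    # tasks until the pieces are small enough to add directly.
    parallelSum := proc(lo::integer, hi::integer)
        local mid, i;
        if hi - lo < 10000 then
            return add(i, i = lo .. hi);            # small enough: do the work here
        end if;
        mid := iquo(lo + hi, 2);
        # spawn two child tasks and combine their results with `+`
        Threads:-Task:-Continue(`+`,
            Task = [parallelSum, lo, mid],
            Task = [parallelSum, mid + 1, hi]);
    end proc:

    Threads:-Task:-Start(parallelSum, 1, 10^7);     # start the root task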

    Maple returns a message saying that the use of global parameters in
    the dsolve command is deprecated and will be eliminated in future
    versions of Maple. One should use the parameters option instead.

    I find entering parameters that way a bit clunky and prefer to use a
    function instead. Is there some good reason to use the parameters
    option? See below for a simple example.
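    For reference, here is a small sketch of my own (not taken from the post) of the parameters option with dsolve/numeric; k is declared as a parameter of the returned procedure and must be given a value before the solution can be evaluated:

    sol := dsolve({diff(x(t), t) = -k*x(t), x(0) = 1}, x(t),
                  numeric, parameters = [k]):
    sol(parameters = [k = 0.5]);    # fix the parameter value
    sol(2.0);                       # evaluate the numeric solution at t = 2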

    Unless you’ve spent the past five years on an isolated island in the middle of the Pacific, you’ll have heard of Facebook and Twitter and LinkedIn and MySpace and Flickr. Social media sites: whether you love them, hate them, or just don’t get them, they’re going to be here for a while. If you’re like many of us, you may have a few accounts on these sites, whether you’re a power user or occasional dabbler. Social media allow us to re-connect with old friends and colleagues, share our thoughts – and photos, advertise, network... and generally waste time. :)

    The evolution of written language started in earnest in 3500 BC with Cuneiform, spurring a step-change in the volume of information that could be recorded and transmitted over large distances.

    This evolved into a wide spectrum of other methods of information transmission. The first transatlantic telegraph cables, for example, were laid in the mid-to-late nineteenth century by information pioneers – industrialists who saw the vast benefit in increasing the rate of information exchange by many orders of magnitude. This led to a Cambrian explosion in the sheer volume of information transmitted internationally, increasing trade and commerce to hitherto unseen levels.


    It seems that Maple doesn't know anything about the convexity of functions.

    It would be nice to have a command in Maple to check the convexity of (real) functions. Maple should also have built-in knowledge about the convexity of common functions: for example, constant functions, linear functions, abs, and sin (convex on specific intervals), etc.
    To handle the calculus of convex functions, Maple should also know theorems such as: if f(x) and g(x) are convex functions and g(x) is non-decreasing, then g(f(x)) is convex, etc.
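    As a rough sketch of my own of what such a command might look like (this is not a built-in Maple command), one can at least test convexity of a twice-differentiable expression on an interval through the sign of its second derivative:

    # f is convex on [a, b] if its second derivative is nonnegative there.
    IsConvexOn := proc(f, x::name, a, b)
        is(diff(f, x, x) >= 0) assuming a <= x, x <= b;
    end proc:

    IsConvexOn(x^2, x, -10, 10);        # x^2 is convex everywhere
    IsConvexOn(sin(x), x, Pi, 2*Pi);    # sin is convex on [Pi, 2*Pi]; is() may return FAIL if it cannot decide

    This only covers the differentiable case; a proper facility would also need the composition rules mentioned above.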

     

    I have been trying to calculate the Lyapunov exponent of the Rossler oscillator, with the aim of computing it at higher precision. When I run the calculation at 16-digit precision in Maple on my 64-bit workstation or on my MacBook, it takes a very long time (15 hours have already passed) and badly slows down the machine. Can somebody please help me make my code more efficient, so that it runs using as few of my computer's resources as possible?
    r := abs(z)^Re(a) * exp(-Im(a) * argument(z));    # the modulus of z^a
    w := r * abs(z)^(Im(a)*I) * (z/abs(z))^Re(a);     # candidate expression that should equal z^a
    
    I want to see that z^a = w.
    
    But simplify(w) gives a wrong result; it differs from w:
    
    tstData := [z = 1+3*I, a = -3+I];                  # first test point
    z^a;           eval(%, tstData):  evalf(%);        # numerical value of z^a
    'w';           eval(%, tstData):  evalf(%);        # numerical value of w
    'simplify(w)'; eval(%, tstData):  evalf(%);        # numerical value of simplify(w)

    tstData := [z = -2*I, a = I];                      # second test point
    z^a;           eval(%, tstData):  evalf(%);
    'w';           eval(%, tstData):  evalf(%);
    'simplify(w)'; eval(%, tstData):  evalf(%);
    
    In the last case simplify(w) results in a purely real value,
    while w has a nonvanishing imaginary part.
    

    In my previous posts I discussed the basic difference between parallel programming and single threaded programming. I also showed how controlling access to shared variables can be used to solve some of those problems. For this post, I am going to discuss more difficulties of writing good parallel algorithms.

    Here are some definitions used in this post:

    • scale: the ability of a program to get faster as more cores are available
    • load balancing: how effectively work is distributed over the available cores
    • coarse grained parallelism: parallelizing routines at a high level
    • fine grained parallelism: parallelizing routines at a low level
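
    To make these terms concrete, here is a rough sketch of my own (not taken from this post), using Threads:-Seq, the parallel version of seq, on work items of very unequal size:

    # Four CPU-bound work items; the last is ten times larger than the others.
    work := [10^5, 10^5, 10^5, 10^6]:
    job := n -> add(i^2, i = 1 .. n):

    # Coarse grained parallelism: each list entry becomes one parallel work item.
    # With only four items the code cannot scale past four cores, and because one
    # item dominates, load balancing is poor: the other cores finish early and
    # then sit idle while the large item completes.
    results := Threads:-Seq(job(work[k]), k = 1 .. nops(work)):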

    Consider the following example

    As of the 9th of October 2009:

    http://www.mapleprimes.com/mapleranking?sort=desc&order=Points
     

    Here is a strange behavior. I can understand that an integer and its float counterpart could be considered different, but the behavior should be the same whether the comparison happens inside a list or not. It should also not depend on the number of trailing zeros, nor on whether the integer is zero.
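
    As a sketch of my own of the kind of discrepancy being described, the following comparisons can be made (actual results may vary between Maple versions):

    evalb(1 = 1.0);           # an integer against its float, as scalars
    evalb([1] = [1.0]);       # the same pair inside lists
    evalb([1.0] = [1.00]);    # the same value with a different number of trailing zeros
    evalb([0] = [0.0]);       # zero instead of a nonzero integer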

    I just want to reiterate how dynamic programming problems can be solved in Maple, especially the dynamic programming models that frequently appear in economics.

    First of all, it is important to note that it is close to impossible to find an easy-to-understand, step-by-step road map to dynamic programming. Why is that?! The Maple code below was basically "discovered" by trial and error and pure stubbornness (caveman 101).
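
    As a rough stand-in sketch of my own (this is not the code from the post), here is a minimal value-function-iteration example for a discrete cake-eating problem, the kind of Bellman recursion that appears in economic models: V(k) = max over c in 1..k of ( ln(c) + beta*V(k - c) ), with V(0) = 0, where k and c are integer units of cake.

    beta := 0.9:
    N := 20:                               # largest cake size considered
    V := Array(0 .. N, fill = 0.0):        # initial guess for the value function
    policy := Array(0 .. N):               # optimal consumption at each cake size

    for iter to 100 do                     # iterate the Bellman operator
        Vnew := Array(0 .. N, fill = 0.0):
        for k from 1 to N do
            best := -infinity:
            for c from 1 to k do
                val := evalf(log(c)) + beta*V[k - c]:
                if val > best then
                    best := val:
                    policy[k] := c:
                end if:
            end do:
            Vnew[k] := best:
        end do:
        V := Vnew:
    end do:

    seq([k, V[k], policy[k]], k = 1 .. N); # value and optimal consumption per cake size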

     

    For the past few weeks I have not been receiving any of the auto-generated e-mails notifying me that someone has responded to a forum, blog, ... that I have contributed to. I checked my settings in MaplePrimes, and this feature is still active for my account. I've not found these messages in my junk folder.

    Is it just me, or are others not receiving these e-mails as well?

    Doug

    Another example of the FromMma conversion, from a recent forum post,

    In the previous post, I described why parallel programming is hard. Now I am going to start describing techniques for writing parallel code that works correctly.

    First some definitions.

    • thread safe: code that works correctly even when called in parallel.
    • critical section: an area of code that will not work correctly if run in parallel.
    • shared: a resource that can be accessed by more than one thread.
    • mutex: a programming tool that controls access to a section of code.
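
    As a minimal sketch of my own (not from this post), here is a critical section protected by a mutex from the Threads package; the shared counter is only updated while the lock is held, so parallel updates cannot interleave:

    m := Threads:-Mutex:-Create():
    total := 0:                            # a shared resource

    bump := proc(n::posint)
        global total;
        local i;
        for i to n do
            Threads:-Mutex:-Lock(m);       # enter the critical section
            total := total + 1;            # safe: only one thread is in here at a time
            Threads:-Mutex:-Unlock(m);     # leave the critical section
        end do;
    end proc:

    # run four copies of bump in parallel threads
    Threads:-Seq(bump(1000), i = 1 .. 4):
    total;                                 # should be 4000
    Threads:-Mutex:-Destroy(m):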