MaplePrimes Posts

MaplePrimes Posts are for sharing your experiences, techniques and opinions about Maple, MapleSim and related products, as well as general interests in math and computing.

Latest Post
  • There is a bug in inttrans:-hilbert:

    restart;

    inttrans:-hilbert(sin(a)*sin(t+b), t, s);

        sin(a)*cos(s)        # wrong: the constant phase b is dropped

    # should be:
    sin(a)*cos(s+b);  expand(%);

        sin(a)*cos(s+b)

        sin(a)*cos(s)*cos(b) - sin(a)*sin(s)*sin(b)
    ########## correction ##############

    `inttrans/expandc` := proc(expr, t)
    # corrected version: the sin/cos expansion below must not assume
    # that the t-dependent term is the first operand of the sum
    local xpr, j, econst, op1, op2;
          xpr := expr;
          for j in indets(xpr,specfunc(`+`,exp)) do
              econst := select(type,op(j),('freeof')(t));
              if 0 < nops(econst) and econst <> 0 then
                  xpr := subs(j = ('exp')(econst)*combine(j/('exp')(econst),exp),xpr)
              end if
          end do;
          for j in indets(xpr,{('cos')(linear(t)), ('sin')(linear(t))}) do
              if type(op(j),`+`) then
                  op1 := select(has, op(j), t); ## fixed: take the t-dependent part explicitly
                  op2 := op(j) - op1;           ## the rest is the constant phase (e.g. b)
                  # the original code assumed a fixed operand order:
                  # op1 := op(1,op(j));
                  # op2 := op(2,op(j));
                  if op(0,j) = sin then
                      xpr := subs(j = cos(op2)*sin(op1)+sin(op2)*cos(op1),xpr)
                  else
                      xpr := subs(j = cos(op1)*cos(op2)-sin(op1)*sin(op2),xpr)
                  end if
              end if
          end do;
          return xpr
    end proc:

    #######################################

    inttrans:-hilbert(sin(a)*sin(t+b), t, s);  expand(%);

        -(1/2)*cos(a-b)*sin(s)+(1/2)*sin(a-b)*cos(s)+(1/2)*cos(a+b)*sin(s)+(1/2)*sin(a+b)*cos(s)

        sin(a)*cos(s)*cos(b) - sin(a)*sin(s)*sin(b)   # now agrees with expand(sin(a)*cos(s+b))


    Download hilbert.mw

     

    To demonstrate Maple 2018’s new Python connectivity, we wanted to integrate a large Python library. The result is the DeepLearning package, which offers an interface to a subset of the TensorFlow machine-learning framework.

    I thought I’d share an application that demonstrates how the DeepLearning package can be used to recognize the numbers in images of handwritten digits.

    The application employs a very small subset of the MNIST database of handwritten digits. Here’s a sample image for the digit 0.

    This image can be represented as a matrix of pixel intensities.        

    The application generates weights for each digit by training a two-layer neural network using multinomial logistic regression. When visualized, the weights for each digit might look like this.

    Let’s say that we’re comparing an image of a handwritten digit to the weights for the digit 0. If a pixel with a high intensity lands in

    • an intensely red area, the evidence is high that the number in the image is 0
    • an intensely blue area, the evidence is low that the number in the image is 0

    While this explanation is deliberately simplified, the application itself offers more detail.
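    The evidence-and-weights idea above can be sketched in a few lines of plain Python. This is only an illustration of the multinomial logistic regression step, using made-up 4-pixel “images” and hand-picked weights; it is not the Maple DeepLearning API and not real MNIST data:

    ```python
    import math

    def predict(pixels, weights, biases):
        # evidence for each class: dot(weights[c], pixels) + biases[c]
        evidence = [sum(w * p for w, p in zip(weights[c], pixels)) + biases[c]
                    for c in range(len(weights))]
        # softmax turns raw evidence into class probabilities
        m = max(evidence)
        exps = [math.exp(e - m) for e in evidence]
        total = sum(exps)
        probs = [e / total for e in exps]
        # the predicted digit is the class with the highest probability
        return probs.index(max(probs)), probs

    # hypothetical weights for two classes over a 4-pixel image
    weights = [[1.0, 0.0, 0.0, 1.0],   # class 0 rewards the outer pixels
               [0.0, 1.0, 1.0, 0.0]]   # class 1 rewards the inner pixels
    biases = [0.0, 0.0]

    digit, probs = predict([0.9, 0.1, 0.0, 0.8], weights, biases)
    # digit == 0 here: the high-intensity pixels land on class 0's high weights
    ```

    In the real application the input is the flattened matrix of pixel intensities and the weights come from training, but the prediction step has this same dot-product-plus-softmax shape.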

    Get the application here

    Using Maple's native syntax, we can calculate the components of acceleration: the tangential and normal scalar components, each with its respective units of measure. The difficult calculations are now a thing of the past; with Maple we solve them and concentrate on interpreting the results for engineering. The worksheets below are in Spanish.

    Calculo_Componentes_Aceleracion_Curvilínea.mw

    Uso_de_comandos_y_operadores_para_calculos_de_componentes_de_la_aceleración.mw
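    As a plain-language companion to the worksheets (which do this symbolically, with units, in Maple), here is a minimal numerical sketch of the same decomposition; the sample curve r(t) and the finite-difference step are my own illustrative choices:

    ```python
    import math

    def derivatives(r, t, h=1e-4):
        # central finite differences for v = r'(t) and a = r''(t)
        v = [(r(t + h)[i] - r(t - h)[i]) / (2 * h) for i in range(3)]
        a = [(r(t + h)[i] - 2 * r(t)[i] + r(t - h)[i]) / h ** 2 for i in range(3)]
        return v, a

    def acceleration_components(v, a):
        # tangential component: projection of a onto the unit tangent, (v . a)/|v|
        dot = sum(vi * ai for vi, ai in zip(v, a))
        # normal component: |v x a| / |v|  (equivalently v^2 / rho)
        cross = [v[1] * a[2] - v[2] * a[1],
                 v[2] * a[0] - v[0] * a[2],
                 v[0] * a[1] - v[1] * a[0]]
        speed = math.sqrt(sum(vi * vi for vi in v))
        a_t = dot / speed
        a_n = math.sqrt(sum(ci * ci for ci in cross)) / speed
        return a_t, a_n

    # illustrative curve: uniform circular motion, r(t) = (cos t, sin t, 0)
    r = lambda t: (math.cos(t), math.sin(t), 0.0)
    v, a = derivatives(r, 0.7)
    a_t, a_n = acceleration_components(v, a)
    # a_t is ~0 and a_n is ~1 for this curve
    ```

    For the unit circle traversed at unit speed the tangential component vanishes and the normal component equals v²/ρ = 1, which is a handy sanity check before moving to curves with units attached.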

    Lenin Araujo Castillo

    Ambassador of Maple

     

     

    The Maple splash screen needs a makeover; it's not too exciting. Looking at the Maplesoft website, the opening screen there has an image that would have been rather fitting for the Maple 2018 splash screen. Here's the image I'm talking about.