ijuptilk

85 Reputation

2 Badges

0 years, 329 days

MaplePrimes Activity


These are replies submitted by ijuptilk

@acer 

It is, but it hasn't actually solved the problem for me. My supervisor says I should try this approach.

@Carl Love 

Okay, thank you so much

@Carl Love 

Please find attached a PDF file that explains how to compute w. The file also contains Maple code for computing w.

How_to_compute_w.pdf

@Carl Love 

Thank you so much. I was expecting to see the result, but instead I received:

simple_deflate := proc (A::Matrix, u::Vector) options operator, arrow; (proc (p) options operator, arrow; A-(u.A[p])/u[p] end proc)(max[index](`~`[abs](u))) end proc

mw2_(1).mw

From the attached file, I want to deflate. Sorry, how does this apply to the simple_deflate code?

@Carl Love 

Please find the file attached.

mw2.mw

@Carl Love 

I commented it out, but it's still not working.
 

restart;
with(LinearAlgebra):
doit := proc(A::Matrix)
    local Q, R, s, id;
    s := A[-1,-1];
    id := IdentityMatrix(Dimension(A));
    Q, R := QRDecomposition(A-s*id, fullspan);
    return R.Q + s*id;
end proc:
A[0] := Matrix(  [[3.0, 1.0, 4.0], [1.0, 2.0, 2.0], [0., 13.0, 2]] ):
for i from 1 to 6 do
    A[i] := doit(A[i-1]);
end do:
##Approximate eigenvalues
sort(Diagonal(A[i-1])):
##Precise eigenvalues
Eigenvalues(A[0]):
##I assume that by "deflate" you mean something like this:
simple_deflate:= (A::Matrix, u::Vector)->
    (p-> A-u.A[p]/u[p])(max[index](abs~(u)));

Error, missing operator or `;`
 

@Carl Love 

Hi,

Thank you. But I used text to comment them out; that's not actually the problem for me. However, I will try what you suggested.

@Carl Love 

Thank you for your response. How can I apply the code to deflate A[6]? If you run the code below, you will get A[0], A[1], ..., A[6], but I want to deflate A[6]. How do I apply this:

simple_deflate:= (A::Matrix, u::Vector)->
    (p-> A-u.A[p]/u[p])(max[index](abs~(u))):

to that result? I'm using Maple 18, and the above doesn't work; it throws: Error, missing operator or `;`

restart;
with(LinearAlgebra):
doit := proc(A::Matrix)
    local Q, R, s, id;
    s := A[-1,-1];
    id := IdentityMatrix(Dimension(A));
    Q, R := QRDecomposition(A-s*id, fullspan);
    return R.Q + s*id;
end proc:
A[0] := Matrix(  [[3.0, 1.0, 4.0], [1.0, 2.0, 2.0], [0., 13.0, 2]] ):
for i from 1 to 6 do
    A[i] := doit(A[i-1]);
end do:
##Approximate eigenvalues
sort(Diagonal(A[i-1])):
##Precise eigenvalues
Eigenvalues(A[0]):

@Rouben Rostamian  

These are the steps for deflating a matrix in Maple. I find it difficult to relate them to the problem I posted.

I don't know if this will help you understand what I mean by deflation.

simple_deflate := proc(A, u)
    local p, ap;
    # compute p value (pindex is defined in the attached notes)
    p := pindex(u);
    # pth row of A
    ap := A[p];
    # output
    return A - (u.ap)/u[p];
end proc:
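For reference, pindex is not a built-in Maple command; in the attached notes it returns the index of the entry of u with the largest absolute value. A minimal sketch, assuming that definition (it matches the max[index](abs~(u)) expression used elsewhere in this thread):

```
# Hypothetical sketch of pindex: returns the position of the entry
# of u with the largest absolute value (the pivot index for deflation).
pindex := proc(u::Vector)
    return max[index](abs~(u));
end proc:
```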

Please see the attached note:

notes08.pdf

@Rouben Rostamian

Thank you for your response. The Francis QR algorithm is as follows:

1. Let s = A^(k)[n,n] be the shift (the bottom-right entry of A^(k)).

2. Compute the QR decomposition A^(k) - sI = QR, where I is the identity matrix.

3. Set A^(k+1) = RQ + sI.

This means that:

A^(0) - sI = [[3.0, 1.0, 4.0], [1.0, 2.0, 2.0], [0., 13.0, 2]] - 2*[[1, 0, 0], [0, 1, 0], [0, 0, 1]],

where s = 2.

I don't know if the writer used this information to obtain the quoted results:

A^0 = [[3.0, 1.0, 4.0], [1.0, 2.0, 2.0], [0., 13.0, 2]], 

A^1 = [[3.5,  -4.264, 0.2688], [-9.206, 1.577, 9.197], [0., -1.41, 1.923]], 

... A^6 = [[8.056,  1.596, 8.584], [0.3596, -2.01, -7.86], [0., 2.576*10^(-16), 0.9542]]. 

The writer gave the above results.

Another question: how do I deflate A^(6) from the computed results?
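As a hedged sketch (not necessarily the notes' exact procedure), assuming the simple_deflate procedure and the A[6] computed earlier in this thread: one possible way to obtain a vector u to pass to simple_deflate is to take an eigenvector of A[6] for one of its (converged) eigenvalues:

```
# Sketch only: assumes A[6] and simple_deflate are already defined,
# and with(LinearAlgebra) has been loaded.
vals, vecs := Eigenvectors(A[6]):
u := Column(vecs, 1):          # eigenvector for the eigenvalue vals[1]
B := simple_deflate(A[6], u):  # Wielandt deflation: vals[1] is replaced by 0 in B
```

This relies on the identity (A - u.A[p]/u[p]).u = 0 when A.u = lambda*u, so the chosen eigenvalue is mapped to zero in the deflated matrix.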

Thank you.

@Kitonum 

Thank you so much. 

@acer 

Okay, thank you

@tomleslie 

I was able to fix it. I deleted the maxsols option and it works perfectly. One other thing: Maple is displaying:

p[1] := 38.01625611/(Pi^2*(1-1.319446973/Pi^2))

rather than a single number.
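For what it's worth, that is Maple's exact form, with Pi kept symbolic. A minimal sketch, assuming p[1] is as shown, that forces a purely numeric value with evalf:

```
# p[1] as displayed, with Pi left symbolic:
p[1] := 38.01625611/(Pi^2*(1 - 1.319446973/Pi^2));
# evalf replaces Pi by its numeric value, yielding a single float:
evalf(p[1]);
```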

Thank you.

@tomleslie 

Hi,

I received this error message when I tried to run the other option: Error, (in Student:-Calculus1:-Roots) unexpected option(s): maxsols = 200

@tomleslie 

Thank you so much.
