Mumps Solver Problem

This forum is for issues related to parallel processing
and OpenSees using the new interpreters OpenSeesSP and OpenSeesMP

Moderator: selimgunay

Ozgun
Posts: 12
Joined: Sun Dec 22, 2013 8:51 pm
Location: University of Illinois at Urbana-Champaign

Mumps Solver Problem

Post by Ozgun » Thu Oct 02, 2014 4:47 pm

Dear all,

I am using OpenSeesSP to take advantage of parallel computing, with the Mumps solver in my analyses. In my previous work, I successfully ran 3D simulations using brickUP elements with the PressureDependMultiYield model. Now I have switched to the Dafalias-Manzari model with the SSPbrickUP element. First, I ran a simple soil column using the shear beam approach. It works fine on a single processor, and the computation time difference is insignificant compared to the brickUP element with the PressureDependMultiYield model. Once I switched to the full 3D model, I started to face a couple of problems. One by one I solved them, but now I am stuck at a point where I cannot find any solution to proceed.

The following is the error I am getting from my 3D analysis:
--------------------------------------------------------------------------------------
WARNING: CTestNormDispIncr::test() - failed to converge
after: 200 iterations
AcceleratedNewton::solveCurrentStep() -The ConvergenceTest object failed in test()
DirectIntegrationAnalysis::analyze() - the Algorithm failed at time 100005
OpenSees > analyze failed, returned: -3 error flag
---------------------------------------------------------------------------------------

So, basically, I have an elastic-plastic gravity analysis followed by a time-history analysis for seismic loading. The elastic stage of the analysis runs without problems, and the seismic stage runs without problems. However, in the plastic stage (between the elastic and seismic stages) I am having convergence issues. I have used ICNTL14 50, which solved a couple of problems, but the convergence problem in the plastic stage is still causing trouble. In addition, although the plastic stage does not converge, the time history runs without problems but is extremely slow (as if I were using a single processor, or less than that!).
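
For reference, here is roughly how the solver and the staged analysis are set up in my script. This is only a minimal sketch: the penalty factors, tolerances, tags and step sizes are placeholders, and the algorithm line is my guess at what produces the AcceleratedNewton message, not my exact input.

# solver and analysis options (placeholder values, not my exact numbers)
system Mumps -ICNTL14 50             ;# give MUMPS 50% extra workspace for fill-in
constraints Penalty 1.0e18 1.0e18
numberer RCM
test NormDispIncr 1.0e-2 200 1
algorithm KrylovNewton               ;# shows up as AcceleratedNewton in the error output
integrator Newmark 0.5 0.25
analysis Transient

# stage 1: elastic gravity -- runs without problems
analyze 20 5.0e3

# stage 2: switch the soil to elastoplastic response -- this is where it fails
updateMaterialStage -material 1 -stage 1
analyze 20 5.0e3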

The problem I am having has been raised before by other users; here is the link:

http://opensees.berkeley.edu/community/ ... php?t=5549


I am looking forward to hearing from anybody who has an idea or suggestion.

Thank you very much.

Ozgun

fmk
Site Admin
Posts: 5883
Joined: Fri Jun 11, 2004 2:33 pm
Location: UC Berkeley

Re: Mumps Solver Problem

Post by fmk » Fri Oct 03, 2014 7:53 am

the solver is not failing in the output you provide .. your model is just not converging .. try some different options in the script for when the analysis fails
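
e.g. something along these lines .. just a sketch, the algorithm names, tolerances and step counts are placeholders to adjust for your model:

# check the return of analyze; on failure retry with initial-stiffness Newton
# and a smaller time step, then restore the original settings
set ok [analyze 1 $dt]
if {$ok != 0} {
    puts "step failed .. retrying with Newton -initial and dt/10"
    algorithm Newton -initial
    test NormDispIncr 1.0e-2 500
    set ok [analyze 10 [expr $dt/10.0]]
    algorithm KrylovNewton
    test NormDispIncr 1.0e-2 200
}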

Ozgun
Posts: 12
Joined: Sun Dec 22, 2013 8:51 pm
Location: University of Illinois at Urbana-Champaign

Re: Mumps Solver Problem

Post by Ozgun » Sun Oct 05, 2014 1:13 pm

fmk,

Thanks for your reply. Based on your suggestion, I have tried a couple of different solvers with different time steps for the gravity analysis. In some of the trials I kept both the elastic and plastic stages of the gravity analysis, and in the others I removed the plastic stage just to be able to move on to the time-history analysis. Right now I have no convergence issue for the dynamic time history. As I stated in my first post, even though I use the Mumps solver, the computation speed is very, very slow, which is kind of unacceptable. The interesting things I observed in my models and computations are:

- When I use the pressure-dependent multi-yield model (PDMY), I did not observe any convergence issue in either the shear beam approach or the 3D model, and the results were perfectly meaningful for the given inputs. I used brickUP elements in the analyses with PDMY.

- Then I shifted to the Dafalias-Manzari model. In the shear beam approach (which is a simplification), I used my previous script, changing only the material and elements: instead of brickUP I used SSPbrickUP, and of course instead of PDMY I used the Dafalias-Manzari material model. For the shear beam approach (which is a column of soil), everything works perfectly, with not much difference in the computation time. However, once I run the 3D model, the computation becomes significantly slower compared to the previous model (PDMY with brickUP); the computation time ratio is almost 1 to 100, respectively.
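
For clarity, the swap amounts to just these lines. This is a sketch only: the Manzari-Dafalias constants shown are the standard Toyoura-sand calibration from Dafalias and Manzari (2004), and the node tags, fluid properties, permeabilities and stabilization parameter in the SSPbrickUP line are placeholders, not my actual inputs.

# previous (fast) pair -- argument lists elided, as in my original script:
#   nDMaterial PressureDependMultiYield 1 3 ...
#   element brickUP 1 1 2 3 4 5 6 7 8 1 $bulk $fmass $kx $ky $kz
#
# current (slow) pair:
nDMaterial ManzariDafalias 1 125.0 0.05 0.934 1.25 0.712 0.019 0.934 \
    0.7 100.0 0.01 7.05 0.968 1.1 0.704 3.5 4.0 600.0 1.60
element SSPbrickUP 1 1 2 3 4 5 6 7 8 1 2.2e6 1.0 1.0e-4 1.0e-4 1.0e-4 0.934 1.5e-5
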
Below I am sharing an output to show that at each time step I am converging to a solution:

CTestNormDispIncr::test() - iteration: 1 current Norm: 1987.78 (max: 0.01, Norm deltaR: 12780.9)
CTestNormDispIncr::test() - iteration: 2 current Norm: 0.003479 (max: 0.01, Norm deltaR: 12780.9)

CTestNormDispIncr::test() - iteration: 1 current Norm: 1985.97 (max: 0.01, Norm deltaR: 12780.9)
CTestNormDispIncr::test() - iteration: 2 current Norm: 0.00352282 (max: 0.01, Norm deltaR: 12780.9)

One more thing: there is an issue with the recorders. I am recording the stresses and strains for the gravity analysis. For the shear beam model it is perfectly fine again. But when I shift to 3D, the time step recorded for the stresses and the time step recorded for the strains are different, whereas I think they should be the same. The values do not make any sense either. This made me think that there is an issue with the recorders as well. I believe the way I put in the recorder commands is correct, because they work well in my shear beam models.


I am kind of stuck at a point where I am not sure whether the material model (Dafalias-Manzari) or the element (SSPbrickUP) itself is causing the computation time problem, or whether I am doing something wrong. From my perspective, the fact that everything runs fine except the 3D model with the Dafalias-Manzari model suggests that the problem is related to the combination of the Mumps solver, the Dafalias-Manzari material model, and the SSPbrickUP element. Of course, right now this is only speculation on my part.

If you have any idea or suggestion, please let me know.

Thank you,

Ozgun

fmk
Site Admin
Posts: 5883
Joined: Fri Jun 11, 2004 2:33 pm
Location: UC Berkeley

Re: Mumps Solver Problem

Post by fmk » Mon Oct 06, 2014 9:33 am

why don't you mix the materials and elements to see where the problem is ..
and those unbalanced forces do not look to be improving any, so one would question whether you have actually achieved convergence with the tolerances you are specifying.
as for the recorders .. if you are not providing a dT to the recorder the time step should be the same, as there is only 1 domain and it has only one time variable .. if the files are different, are you sure the analysis is finishing and the program closing? (if not, it might just be that the files have not been written to disk) .. try the -closeOnWrite option.
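
e.g. something like this .. the file names, element range and dT are placeholders:

# flush each line to disk as soon as it is recorded, so the files are
# complete even while the program is still running
recorder Element -file stress.out -time -closeOnWrite -dT 5.0e3 -eleRange 1 100 stress
recorder Element -file strain.out -time -closeOnWrite -dT 5.0e3 -eleRange 1 100 strain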

Ozgun
Posts: 12
Joined: Sun Dec 22, 2013 8:51 pm
Location: University of Illinois at Urbana-Champaign

Re: Mumps Solver Problem

Post by Ozgun » Mon Oct 06, 2014 11:32 am

"why don't you mix the materials and elements to see where the problem is .."

- I will try to do that.
--------------------------------------------------------------------------------------------------------------------------
"those unbalanced forces do not look to be improving any so one would question whether you have actually achieved convergence with tolerances you are specifying"

- I am using the penalty method for the constraint handler. So I think the current Norm, rather than Norm deltaR, is the indication of convergence. Isn't that true?
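
For reference, the relevant lines in my script look like this (the penalty factors are placeholders, not my exact values):

constraints Penalty 1.0e18 1.0e18
# NormDispIncr converges on the displacement-increment norm ("current Norm");
# the residual norm ("Norm deltaR") is only reported, not tested
test NormDispIncr 1.0e-2 200 1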
--------------------------------------------------------------------------------------------------------------------------
"if you are not providing a dT to the recorder the time step should be the same, as there is only 1 domain and it has only one time variable .. if the filess are different, are you sure the analysis is finishing and program closing (if not it might be just be to the fact that files are not been written to disk)"

- I am providing a dT. The issue is that when I run the elastic and plastic gravity analyses on a single processor with the simple shear beam model, the recorders have no problem. (E.g., an elastic gravity analysis using 20 steps with dT = 5e3 ends at 100000 sec, which is correct, and the stress conditions at the end of the analysis are correct too.) But when I run the same model in parallel, the stress recorders only go up to 85000 and the strain recorders only go up to 95000, and the stress and strain values are wrong. The program is not closing, because I continue with the transient analysis, but I put in a command to warn me when the elastic gravity analysis is done. The values I gave above are from after that warning appears on the screen, which means the gravity analysis is done.
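
(The warning I mentioned is just a puts after the gravity loop, roughly like this, with placeholder step values:)

# 20 steps x dT = 5.0e3 should end at 1.0e5 s of pseudo-time
analyze 20 5.0e3
puts "elastic gravity done at pseudo-time [getTime] (expect 100000)"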

Thanks,

Ozgun
