Search found 12 matches

by Ozgun
Thu Jan 22, 2015 2:07 pm
Forum: OpenSees.exe Users
Topic: remove mp command issue
Replies: 2
Views: 2445

Re: remove mp command issue

I did not build it myself; I am using the latest version of OpenSeesSP.exe downloaded from the website, and I am running it on my personal computer.
by Ozgun
Mon Jan 19, 2015 9:29 pm
Forum: OpenSees.exe Users
Topic: remove mp command issue
Replies: 2
Views: 2445

remove mp command issue

I am trying to remove an equalDOF constraint from my system, but I am getting the following error:

ShadowSubdomain::removeMP_Constraints() - not yet implemented.

I am using OpenSeesSP. Is it possible to remove an equalDOF constraint?
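For context, this is the kind of call that triggers the message: in sequential OpenSees an MP constraint can be removed with the `remove mp` command, which OpenSeesSP rejects. A hedged sketch (node tags are hypothetical, and the exact argument form may vary by version, so check the manual for your build):

```tcl
# Hypothetical tags: node 4 follows node 3 in DOFs 1 and 2
equalDOF 3 4 1 2

# In sequential OpenSees the constraint can later be removed with
# "remove mp" (argument form may vary by version). This is the call
# that OpenSeesSP answers with the
# ShadowSubdomain::removeMP_Constraints() message above.
remove mp 4
```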

Thanks,

Ozgun
by Ozgun
Fri Dec 05, 2014 10:09 am
Forum: OpenSees.exe Users
Topic: problem with Manzari-Dafalias material
Replies: 11
Views: 9792

Re: problem with Manzari-Dafalias material

sanjianke wrote:
> Ozgun wrote:
> > can you try to play with alpha parameter and see if it works?
>
> I guess you meant the alpha value for defining SSPquadUP element. I just used the
> formula given in the manual to compute it as alpha = 0.25*(h^2)/(den*c^2). Maybe I
> can try to fix the issue by increasing or decreasing the alpha value as you
> suggested. Thanks.

I do not know whether adjusting the alpha parameter is the correct way to fix convergence issues, but in my case it worked. The manual describes it as a stabilizing parameter.
Again, I have to admit that I do not know if that is the correct way to deal with these issues; it is just my observation. I hope somebody can explain the role of the alpha parameter in more detail.
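For reference, the formula quoted above can be evaluated directly in Tcl. A minimal sketch, where all numeric values are hypothetical placeholders for the element size h, the density den, and the velocity term c in the manual's formula:

```tcl
# alpha = 0.25*h^2/(den*c^2), per the SSPquadUP manual formula quoted above.
# All numeric values below are hypothetical placeholders.
set h   1.0    ;# element size
set den 1.7    ;# mass density
set c   250.0  ;# velocity term in the manual's formula
set alpha [expr {0.25 * pow($h, 2) / ($den * pow($c, 2))}]
puts "alpha = $alpha"
```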
by Ozgun
Fri Dec 05, 2014 9:55 am
Forum: OpenSees.exe Users
Topic: problem with Manzari-Dafalias material
Replies: 11
Views: 9792

Re: problem with Manzari-Dafalias material

sanjianke wrote:
> Ozgun wrote:
> > mhm,
> >
> > I think it is not a suggested value. 0.05 is used in the generic examples
> > but you should use the Poisson's ratio that is valid for the problem you
> > are working on. otherwise your gravity analysis does not end up with
> > correct initial state of stress.
>
> If I follow your thought, I should use the specific initial shear modulus G0 in my
> case instead of the G0=125 in the generic example. Am I right?

Yes, you should use the G0 value corresponding to your case; G0 = 125 is just an example value.
by Ozgun
Tue Dec 02, 2014 7:27 pm
Forum: OpenSees.exe Users
Topic: problem with Manzari-Dafalias material
Replies: 11
Views: 9792

Re: problem with Manzari-Dafalias material

mhm,

I think it is not a suggested value. 0.05 is used in the generic examples, but you should use the Poisson's ratio that is valid for the problem you are working on; otherwise your gravity analysis will not end up with the correct initial state of stress.
by Ozgun
Mon Dec 01, 2014 11:05 am
Forum: OpenSees.exe Users
Topic: problem with Manzari-Dafalias material
Replies: 11
Views: 9792

Re: problem with Manzari-Dafalias material

Can you try to play with the alpha parameter and see if it works?
by Ozgun
Sun Nov 30, 2014 9:51 pm
Forum: Parallel Processing
Topic: UpdateMaterialStage in OpenSeesSP
Replies: 28
Views: 29075

Re: UpdateMaterialStage in OpenSeesSP

hi brag006,

Can you briefly explain how you managed to solve the problem you described here?
I am running a site response analysis.
Depending on the size of the problem, I get the warning "WARNING: MaterialStageParameter::setDomain() - no effect with material tag 1", and this warning appears for all of the elements in the model.

The interesting thing about using PDMYM with Mumps is that, even though I get this warning in some of the models, the results are okay. However, if I use Mumps for the gravity analysis and SuperLU for the transient analysis, I get slightly distorted results, with the same warning still present. If I use SuperLU in both the gravity and transient analyses, the analyses complete, but I end up with rigid body motion (basically no relative acceleration).

In case I use the Dafalias-Manzari model, exactly the opposite happens: using Mumps does not work at all, and SuperLU ends up with rigid body motion, which is strange because I have no problem in sequential analysis.

Thanks for your response in advance,

Ozgun
by Ozgun
Fri Nov 28, 2014 1:31 pm
Forum: Parallel Processing
Topic: OpenSeesSP potential parallel computing problem
Replies: 0
Views: 2654

OpenSeesSP potential parallel computing problem

Hi,

I am using the Manzari-Dafalias model with parallel processing. I have a very simple single column of soil on which I am running gravity loading and dynamic time-history analyses, and it works in sequential computing without any problem. When I run it in parallel using the SuperLU solver, I get the following warning at the beginning of the analysis:

"WARNING: MaterialStageParameter::setDomain() - no effect with material tag 1". And this warning appears for all of the material tags (20 in total).
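For reference, the stage update that triggers this warning is issued for each material tag; a hedged sketch of that part of the script (the tag range 1-20 is an assumption based on the 20 tags mentioned above):

```tcl
# Hedged sketch: switching all 20 materials to their nonlinear stage.
# Material tags are assumed to run from 1 to 20.
for {set matTag 1} {$matTag <= 20} {incr matTag} {
    updateMaterialStage -material $matTag -stage 1
}
```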

I think there is a problem with the domain decomposition, or there may be a problem with sending and receiving information between the cores. Does anybody have a clearer explanation?

Also, while trying to run in parallel, "NDMaterial::getTangent -- subclass responsibility" scrolls across the screen. Does anybody know what this means?

Edit: In addition to the issues above, I found that when using SuperLU with 4 cores for my case, the memory distribution is not balanced: one core uses 200 MB, whereas the other three use only 4 MB each. With the Mumps solver the memory distribution is equal, but Mumps brings a bunch of other problems in terms of convergence during the gravity analysis (reported by others too), so I am trying to avoid Mumps as much as possible.

When using Mumps, the error I get is the following (note: the same analysis works fine in sequential mode):

WARNING MumpsParallelSolver::solve(void)- Error -10 returned in substitution dmumps()
cause: Matrix is Singular Numerically
WARNING NewtonRaphson::solveCurrentStep() -the LinearSysOfEqn failed in solve()
DirectIntegrationAnalysis::analyze() - the Algorithm failed at time 5000
OpenSees > analyze failed, returned: -3 error flag
Gravity Elastic Execution time: 0 hours and 0 minutes and 1 seconds.
First run done.

Thank you,

Ozgun
by Ozgun
Mon Oct 06, 2014 11:32 am
Forum: Parallel Processing
Topic: Mumps Solver Problem
Replies: 4
Views: 7896

Re: Mumps Solver Problem

"why don't you mix the materials and elements to see where the problem is .."

- I will try to do that.
--------------------------------------------------------------------------------------------------------------------------
"those unbalanced forces do not look to be improving any so one would question whether you have actually achieved convergence with tolerances you are specifying"

- I am using the penalty method for the constraint handler, so I think the current Norm, rather than Norm deltaR, is the indication of convergence. Isn't that true?
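For reference, the constraint handler referred to above is set up along these lines; the two penalty factors are hypothetical placeholder values:

```tcl
# Hedged sketch: penalty constraint handler, as mentioned above.
# The two factors (for SP and MP constraints) are hypothetical values.
constraints Penalty 1.0e14 1.0e14
```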
--------------------------------------------------------------------------------------------------------------------------
"if you are not providing a dT to the recorder the time step should be the same, as there is only 1 domain and it has only one time variable .. if the filess are different, are you sure the analysis is finishing and program closing (if not it might be just be to the fact that files are not been written to disk)"

- I am providing dT. The issue is that when I run the elastic and plastic gravity analyses on a single processor with the simple shear-beam model, the recorders have no problem (e.g., an elastic gravity analysis using 20 steps with dT = 5e3 ends at 100000 s, which is correct, and the stress conditions at the end of the analysis are correct too). But when I run the same model in parallel, the stress recorders only go up to 85000 and the strain recorders only up to 95000, and the stress and strain values are wrong. The program is not closing, because I continue with the transient analysis, but I print a message when the elastic gravity analysis is done, and the values above are taken after that message appears, which means the gravity analysis has finished.
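A hedged sketch of the recorder setup being described; the file names and element range are hypothetical, and only the dT value comes from the post:

```tcl
# Hedged sketch: element recorders with an explicit dT, as described above.
# File names and element range are hypothetical; dT = 5e3 matches the post.
recorder Element -file stress.out -time -dT 5.0e3 -eleRange 1 20 stress
recorder Element -file strain.out -time -dT 5.0e3 -eleRange 1 20 strain
# 20 gravity steps of dT = 5e3 should end at t = 100000 s.
```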

thanks,

ozgun
by Ozgun
Sun Oct 05, 2014 2:00 pm
Forum: Parallel Processing
Topic: Dafalias Manzari constitutive model in OpenSeesSP
Replies: 6
Views: 8588

Re: Dafalias Manzari constitutive model in OpenSeesSP

Dear all,

I would like to ask whether anybody has used the Dafalias-Manzari model with the Mumps solver. I am trying to use it and am having issues with speed and convergence.

I am linking another topic in order not to duplicate the issue.

http://opensees.berkeley.edu/community/ ... 24#p101624

Thanks,

Ozgun
by Ozgun
Sun Oct 05, 2014 1:13 pm
Forum: Parallel Processing
Topic: Mumps Solver Problem
Replies: 4
Views: 7896

Re: Mumps Solver Problem

fmk,

Thanks for your reply. Based on your suggestion, I tried a couple of different solvers with different time steps for the gravity analysis. In some of the trials I kept both the elastic and plastic stages of the gravity analysis, and in others I removed the plastic stage just to be able to move on to the time-history analysis. Right now I have no convergence issue in the dynamic time-history analysis. But as I stated in my first post, even though I use the Mumps solver, the computation is very, very slow, which is unacceptable. The interesting things I observed in my models and computations are:

- When I use the pressure-dependent multi-yield model (PDMYM), I did not observe any convergence issue in either the shear-beam approach or the 3D model, and the results were perfectly meaningful for the given inputs. I used brickUP elements in the analyses with PDMYM.

- Then I switched to the Dafalias-Manzari model. For the shear-beam approach (which is a simplification: a single column of soil), I used my previous script, changing the material and elements: SSPbrickUP instead of brickUP, and of course the Dafalias-Manzari material model instead of PDMY. For the shear-beam approach everything works perfectly, with little difference in computation time. However, once I ran the 3D model, the computation became far slower than the previous setup (PDMYM with brickUP); the computation time ratio is almost 100 to 1.
Below I am sharing an output excerpt to show that I am converging to a solution at each time step:

CTestNormDispIncr::test() - iteration: 1 current Norm: 1987.78 (max: 0.01, Norm deltaR: 12780.
CTestNormDispIncr::test() - iteration: 2 current Norm: 0.003479 (max: 0.01, Norm deltaR: 12780.9)

CTestNormDispIncr::test() - iteration: 1 current Norm: 1985.97 (max: 0.01, Norm deltaR: 12780.
CTestNormDispIncr::test() - iteration: 2 current Norm: 0.00352282 (max: 0.01, Norm deltaR: 12780.9)
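For completeness, output like the above would come from a displacement-increment convergence test along these lines; the tolerance matches the printout, while the iteration limit and print flag are assumptions:

```tcl
# Hedged sketch: convergence test matching the CTestNormDispIncr output
# above. Tolerance 0.01 is from the printout; the iteration limit and
# print flag are assumed values.
test NormDispIncr 0.01 200 2
```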

One more thing: there is an issue with the recorders. I am recording the stresses and strains during the gravity analysis. For the shear-beam model it is perfectly fine again, but when I shift to 3D, the time step recorded for stresses and the time step recorded for strains are different, which I think should be the same, and the values do not make any sense either. This makes me think there is an issue with the recorders as well. I believe the recorder commands are written correctly, because they work well in my shear-beam models.


I am stuck at a point where I am not sure whether the material model (Dafalias-Manzari) or the element (SSPbrickUP) itself is causing the computation-time problem, or whether I am doing something wrong. From my perspective, the fact that everything runs fine except the 3D model with the Dafalias-Manzari model suggests that the problem is related to the combination of the Mumps solver, the Manzari-Dafalias material model, and the SSPbrickUP element. Of course, right now this is only speculation on my part.

If you have any idea or suggestion, please let me know.

Thank you,

Ozgun
by Ozgun
Thu Oct 02, 2014 4:47 pm
Forum: Parallel Processing
Topic: Mumps Solver Problem
Replies: 4
Views: 7896

Mumps Solver Problem

Dear all,

I am using OpenSeesSP to take advantage of parallel computing, with the Mumps solver in my analyses. In my previous work, I successfully ran 3D simulations using brickUP elements with the PressDependMultiYield model. Now I have switched to the Dafalias-Manzari model with the SSPbrickUP element. First, I ran a simple soil column using the shear-beam approach; it works fine on a single processor, and the computation time difference is insignificant compared to the brickUP element with the PressDependMultiYield model. Once I switched to the full 3D model, I started to face a couple of problems. I solved them one by one, but I got stuck at a point where I could not find any solution to proceed.

The following is the error I am getting from my 3D analysis :
--------------------------------------------------------------------------------------
WARNING: CTestNormDispIncr::test() - failed to converge
after: 200 iterations
AcceleratedNewton::solveCurrentStep() -The ConvergenceTest object failed in test()
DirectIntegrationAnalysis::analyze() - the Algorithm failed at time 100005
OpenSees > analyze failed, returned: -3 error flag
---------------------------------------------------------------------------------------

So, basically I have an elastic-plastic gravity analysis followed by a time-history analysis for seismic loading. The elastic stage runs without problem, and the seismic stage runs without problem. However, in the plastic stage (between the elastic and seismic stages) I am having convergence issues. I used ICNTL14 50, which solved a couple of problems, but the convergence problem in the plastic stage is still causing trouble. In addition, although the plastic stage does not converge, the time-history analysis runs without problem but is extremely slow (as if I were using a single processor, or even less!).
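A hedged sketch of the staging sequence described above; the material tag, step counts, and time steps are placeholders, not the actual script:

```tcl
# Hedged sketch of the elastic -> plastic -> seismic staging described
# above. Tag, step counts, and dT values are placeholders.
updateMaterialStage -material 1 -stage 0   ;# elastic gravity stage
analyze 20 5.0e3
updateMaterialStage -material 1 -stage 1   ;# plastic gravity stage
analyze 20 5.0e3
# ... then reconfigure the integrator/solver and run the seismic stage ...
```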

The problem I am having was reported before by other users; here is the link:

http://opensees.berkeley.edu/community/ ... php?t=5549


I am looking forward to hearing from anybody who has an idea or suggestion.

Thank you very much.

Ozgun