error running parallel

This forum is for issues related to parallel processing
and OpenSees using the new interpreters OpenSeesSP and OpenSeesMP

Unni Kartha G
Posts: 11
Joined: Sat Jul 18, 2009 11:16 pm
Location: Cochin University of Science and Technology

error running parallel

Post by Unni Kartha G » Wed Oct 07, 2009 4:43 am

I built OpenSeesSP and tried out one of the examples, the cantilever problem, with a small modification.
I am getting the following error:

[b]unni@dldap:~$ mpirun -np 2 OpenSeesSP/bin/OpenSeesSP canti.tcl [/b]
Master Process Running OpenSees Interpreter 0


OpenSees -- Open System For Earthquake Engineering Simulation
Pacific Earthquake Engineering Research Center -- 2.1.0

(c) Copyright 1999,2000 The Regents of the University of California
All Rights Reserved
(Copyright and Disclaimer @


[dldap:03972] *** An error occurred in MPI_Recv
[dldap:03972] *** on communicator MPI_COMM_WORLD
[dldap:03972] *** MPI_ERR_TRUNCATE: message truncated
[dldap:03972] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
Slave Process Running 1
MPI_Channel::recvID() - incorrect number of entries for ID received: 4 exptected: 6
WARNING ParallelNumberer::numberDOF - DOF_Group 96not in AnalysisModel!
[dldap:03975] *** Process received signal ***
[dldap:03975] Signal: Segmentation fault (11)
[dldap:03975] Signal code: Address not mapped (1)
[dldap:03975] Failing at address: (nil)
[dldap:03975] [ 0] [0xb7f6d40c]
[dldap:03975] [ 1] OpenSeesSP/bin/OpenSeesSP(_ZN33StaticDomainDecompositionAnalysis13domainChangedEv+0x65) [0x81ef135]
[dldap:03975] [ 2] OpenSeesSP/bin/OpenSeesSP(_ZN33StaticDomainDecompositionAnalysis7analyzeEd+0x3c) [0x81ef37c]
[dldap:03975] [ 3] OpenSeesSP/bin/OpenSeesSP(_ZN33StaticDomainDecompositionAnalysis7newStepEd+0x1d) [0x81eee4d]
[dldap:03975] [ 4] OpenSeesSP/bin/OpenSeesSP(_ZN14ActorSubdomain3runEv+0x13ca) [0x819008a]
[dldap:03975] [ 5] OpenSeesSP/bin/OpenSeesSP(_ZN13MachineBroker9runActorsEv+0xca) [0x82533fa]
[dldap:03975] [ 6] OpenSeesSP/bin/OpenSeesSP(main+0x96) [0x8179706]
[dldap:03975] [ 7] /lib/i686/cmov/libc.so.6(__libc_start_main+0xe5) [0xb79dc455]
[dldap:03975] [ 8] OpenSeesSP/bin/OpenSeesSP(_ZNSt8ios_base4InitD1Ev+0x59) [0x8178d31]
[dldap:03975] *** End of error message ***
WARNING AnalysisMode::getGroupGraph - 0 vertices, has the Domain been populated?
MPI_Channel::recvID() - incorrect number of entries for ID received: 2 exptected: 4
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 3975 on node dldap exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

[b]The source input is as follows:[/b]



# SET UP ----------------------------------------------------------------------------
wipe;
model basic -ndm 2 -ndf 3;
file mkdir data;

# define GEOMETRY -------------------------------------------------------------
# nodal coordinates:
node 1 0 0;
node 2 0 432;
node 3 432 432;

# Single point constraints -- Boundary Conditions
fix 1 1 1 1;

# nodal masses:
mass 2 5.18 0. 0.;

# Define ELEMENTS -------------------------------------------------------------
geomTransf Linear 1;

# connectivity: (make A very large, 10e6 times its actual value)
element elasticBeamColumn 1 1 2 3600000000 4227 1080000 1;
element elasticBeamColumn 2 2 3 3600000000 4227 1080000 1;
element elasticBeamColumn 3 3 1 3600000000 4227 1080000 1;

# Define RECORDERS -------------------------------------------------------------
recorder Node -file data/DFree.out -time -node 2 -dof 1 2 3 disp;
#recorder Node -file Data/DBase.out -time -node 1 -dof 1 2 3 disp;
#recorder Node -file Data/RBase.out -time -node 1 -dof 1 2 3 reaction;
#recorder Drift -file Data/Drift.out -time -iNode 1 -jNode 2 -dof 1 -perpDirn 2;
#recorder Element -file Data/FCol.out -time -ele 1 globalForce;
#recorder Element -file Data/DCol.out -time -ele 1 deformation;

# define GRAVITY -------------------------------------------------------------
pattern Plain 1 Linear {
    load 2 0. -2000. 0.;
}

constraints Plain;
numberer Plain;
system SparseGeneral;
test NormDispIncr 1.0e-8 6;
algorithm Newton;
integrator LoadControl 0.1;
analysis Static;
analyze 10;
loadConst -time 0.0;

# define LATERAL load -------------------------------------------------------------
# Lateral load pattern
pattern Plain 2 Linear {
    load 2 2000. 0.0 0.0;
}

# pushover: displacement-controlled static analysis
integrator DisplacementControl 2 1 0.1;
analyze 1000;

puts "Done!"
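
A quick sanity check before the model definition is to confirm how many processes mpirun actually started. The lines below are only a sketch: getNP and getPID are documented for OpenSeesMP, and it is an assumption here that OpenSeesSP exposes them the same way (under OpenSeesSP the input file is interpreted by the master process only).

# Sketch: confirm the parallel setup before building the model.
set np [getNP];   # total number of processes started by mpirun
set pid [getPID]; # rank of this process (0 = master)
puts "Process $pid of $np"
if {$np < 2} {
    puts "WARNING: only one process started; nothing to partition."
}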
unnikartha

fmk
Site Admin
Posts: 5883
Joined: Fri Jun 11, 2004 2:33 pm
Location: UC Berkeley

Post by fmk » Mon Oct 12, 2009 3:54 pm

Your model was too small. I have made changes so that in the future it will simply exit with a message saying the model is too small.
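
For example, giving the domain decomposition something to split, by discretizing each member into several elements, should keep a model above such a limit. The following is only a sketch, with section properties copied from the script above and purely illustrative node/element numbering:

# Sketch: refine the vertical member into nEle sub-elements so each
# subdomain receives a non-trivial share of the model.
set nEle 10;  # number of sub-elements along the column
set H 432.0;  # column height from the original model
for {set i 0} {$i <= $nEle} {incr i} {
    node [expr $i + 1] 0 [expr $H * $i / $nEle]
}
fix 1 1 1 1;
geomTransf Linear 1;
for {set i 1} {$i <= $nEle} {incr i} {
    element elasticBeamColumn $i $i [expr $i + 1] 3600000000 4227 1080000 1
}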
