Analysis Manager Programs
The Analysis Manager consists of two main parts: the user interface and a number of daemons. Each part and its executables are described below. All executables are found in the $P3_HOME/p3manager_files/bin directory, where $P3_HOME is the installation directory, typically /msc/patran200x.
User Interface
The first part of the Analysis Manager is the user interface, from which a user submits jobs and monitors their progress (P3Mgr is the executable name). This program can be executed in many different ways and from many different locations (i.e., either locally or remotely over a network). An administration tool is also available to set up and edit configuration files and to test for proper installation. (AdmMgr is the executable name on Unix. On Windows there is no separate executable; it is part of P3Mgr.) A small editor program (p3edit) is also part of the user interface and is invoked directly from the main user interface when editing and viewing files.
Two shell scripts, p3analysis_mgr and p3am_admin, are used to invoke the Analysis Manager and the administration tool, respectively. When properly installed, these scripts automatically determine the installation path directory structure and which machine architecture executable to use.
Daemons
The second part of the Analysis Manager is a series of daemons (or services on Windows) which actually execute and control jobs. These daemons are responsible for queuing jobs, finding a host to run jobs, moving data files to selected hosts, executing the selected analysis code, etc. Each one is described here:
Queue Manager
This is a daemon (or service on Windows) which must run at all times (QueMgr is the executable name). The machine on which the Queue Manager runs is known as the master host. Generally it runs as root (or Administrator) and is responsible for scheduling jobs. The Queue Manager always has a complete account of all jobs running and/or queued. When a request to run a job is received, the Queue Manager checks which hosts are eligible to run the selected code and how many jobs each host is currently running. If an eligible host exists, the Queue Manager starts the task on that host. If the Analysis Manager is installed along with a third-party scheduling program (e.g., LSF or NQS), the Queue Manager is responsible for communicating with the scheduling software to control job execution. In summary, the Queue Manager is the scheduler of the Analysis Manager environment. (Also, see Starting the Queue/Remote Managers, Starting the Queue Manager.)
Remote Manager
There is only one Queue Manager, but there can be many Remote Managers. An RmtMgr process runs on every analysis machine, that is, every machine configured to run an analysis such as MSC Nastran or MSC.Marc. An RmtMgr can also run on each submit machine (recommended; see Job Manager below), that is, each machine from which the analysis is submitted, such as where Patran runs. If the submit and analysis machines are the same host, only one RmtMgr needs to be running. The QueMgr and RmtMgr processes start automatically at boot time and run continuously, but they use very little memory and CPU, so users will not notice any performance effect. These processes can run as root (Administrator on Windows) or, if those privileges are not available, as any user.
Each RmtMgr binds to a known, chosen port number that is the same for every RmtMgr machine. Each RmtMgr process collects machine statistics on free CPU cycles, free memory, and free disk space, and returns this data to the QueMgr at frequent intervals. The RmtMgr is also used to execute a command and return that command's output on the host where it runs, essentially what a remote shell (rsh) host command does on a Unix machine.
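A quick way to confirm that an RmtMgr is up on a Unix host is to look for a listening socket on its port. This check uses generic networking tools, not the Analysis Manager itself, and assumes the default port 1800:

   netstat -an | grep 1800

A line in LISTEN state on port 1800 indicates the RmtMgr is bound and waiting for connections.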
 
Note:  
It is best to run the RmtMgr service on Windows as someone other than SYSTEM (the default if you do not do anything different). After installing the RmtMgr, use the control panel to access the services, find the RmtMgr, and change its startup to use a different account: something generic if it exists, or an Analysis Manager admin account. If the RmtMgr is running as a user and not SYSTEM, then the NasMgr/MarMgr/AbaMgr/GenMgr will run as this user and have access to Windows networking, shared drives, and so on. If it is run as SYSTEM, it is limited to local Windows drives, shares, etc. The QueMgr does little file handling, so running it as SYSTEM is acceptable.
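As an alternative to the control panel, the service account can be changed from a command prompt with the standard Windows sc utility. This is only a sketch; it assumes the default service name MSCRmtMgr, and the account name and password shown are placeholders:

   sc config MSCRmtMgr obj= ".\am_admin" password= "secret"
   sc stop MSCRmtMgr
   sc start MSCRmtMgr

Note that sc requires the space after obj= and password=.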
Job Manager
The Job Manager (JobMgr executable name) runs for the life of a job. When a user submits a job using the Analysis Manager, the user interface tells the Queue Manager about the job and then starts a Job Manager daemon. The Job Manager daemon receives and saves job information from the Analysis Manager's user interface. The main purpose of the Job Manager is to record job status for monitoring and file transfer.
During the execution of jobs, users utilizing the Analysis Manager's user interface program can seamlessly connect to the Job Manager of their job and see what the status of the job is. In summary, the Job Manager controls the execution of a single job and is always aware of the current status of that job. The Job Manager runs on the submit host machine.
 
Note:  
On Windows, if a RmtMgr is running on the local machine, the JobMgr is started through it as usual. If a RmtMgr is NOT running, a JobMgr is started anyway and the submit still works. The only restriction in this latter case is that if the user logs off, a popup dialog appears asking whether the user really wants to log off; if the user confirms, the job is terminated. This does not happen if the RmtMgr is running as a service.
MSC Nastran Manager
The MSC Nastran Manager (NasMgr executable name) runs only for the life of a job. The MSC Nastran Manager is started by the Queue Manager when the task reaches the top of its queue and is eligible to run. The purpose of the MSC Nastran Manager is to run the MSC Nastran job. When the NasMgr first comes up, it generates FMS (if necessary), checks to see if there is enough disk space, etc. The NasMgr will make sure it has all of the files it needs for the job. If not, it will obtain them. Finally, the MSC Nastran job is started.
During execution, the NasMgr relays pertinent information (disk usage, cpu, etc.) to the Job Manager (JobMgr), which then updates the graphical information displayed to the user. The NasMgr is also responsible for cleaning up files and putting results back to desired locations, as well as reporting its status to the Job Manager. This daemon runs on the analysis host machine and only for the life of the analysis.
MSC.Marc Manager
The MSC.Marc Manager (MarMgr executable name) runs only for the life of a job. The MarMgr is identical in function to the MSC Nastran Manager (NasMgr) except it is for execution of MSC.Marc analyses.
ABAQUS Manager
The ABAQUS Manager (AbaMgr executable name) runs only for the life of a job. The AbaMgr is identical in function to the MSC Nastran Manager (NasMgr) except it is for execution of ABAQUS analyses.
General Manager
The General Manager (GenMgr executable name) runs only for the life of a job. The GenMgr is identical in function to the MSC Nastran Manager (NasMgr) except it is for execution of general analysis applications.
Editor
The editor (p3edit executable name) runs when requested from P3Mgr to view results files or edit the input deck.
Text Manager
The Text Manager (TxtMgr executable name) is a text-based interface to the Analysis Manager that illustrates the Analysis Manager API. See Application Procedural Interface (API).
Job Viewer
The job viewer (Job_Viewer executable name) is a simple program available on UNIX platforms for opening the Analysis Manager’s database file and viewing job statistics. This file is generally located in $P3_HOME/p3manager_files/default/log/QueMgr.rdb. You must run Job_Viewer and then open the file manually.
Analysis Manager Program Startup Arguments
AbaMgr, NasMgr, MarMgr, GenMgr
Started automatically by QueMgr (or NQS/LSF); no command line arguments.
JobMgr
Started automatically by P3Mgr/TxtMgr (or RmtMgr); no command line arguments.
RmtMgr
This is a daemon on Unix (or a service on Windows) and is started automatically at boot time. Possible command line arguments are listed below (also see Organization Environment Variables); an example invocation follows the table:
 
Argument
Description
-version
Just prints Analysis Manager version and exits
-ultima
Switch to change P3_HOME to AM_HOME and p3manager_files to analysis_manager, so that no "p3" names are required in the environment. (Generally not used.)
-port <####>
Port number to use. This MUST be the SAME port number for ALL RmtMgrs across the whole network (per QueMgr). The default is 1800 if not set.
-path <path>
Use to specify base path for finding the Analysis Manager executables: $P3_HOME/p3manager_files/bin/{arch}/*Mgr.
<path> is the base path $P3_HOME. The default is the path the program was started with, but this fails when the program is started with a relative path (e.g., "./RmtMgr ..."). If a full path is used to start RmtMgr (as in a startup script), this argument is not needed.
-orgpath <path>
Use to specify base path for finding the Analysis Manager org tree (configuration files and directories): $P3_HOME/p3manager_files/{org}/{conf,log,proj}.
<path> is the base path $P3_HOME. Use to specify the base path to find the org tree only if different than the -path argument.
RmtMgr writes files in the proj/{projectname} directories, so if this is not the default (desired) location (same as -path above) then this argument needs to be set.
-name <name>
Windows only. Use if you want to run more than one RmtMgr service. Each must have a unique name so the start/stop method can distinguish which one to work with. Default <name> is MSCRmtMgr.
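For example, a Unix startup script might launch RmtMgr with a full path and a non-default port (the installation path, architecture directory, and port number here are illustrative):

   /msc/patran200x/p3manager_files/bin/HP700/RmtMgr -port 1801 &

Whatever port is chosen must match on every RmtMgr machine and must match the QueMgr's -rmtmgrport value described below.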
QueMgr (AdmMgr)
This is a daemon on Unix (or a service on Windows) and is started automatically at boot time. Possible command line arguments are listed below (also see Organization Environment Variables); an example invocation follows the table:
 
Note:  
On Unix, the AdmMgr (p3am_admin) accepts the same arguments as QueMgr.
 
Argument
Description
-version
Just prints Analysis Manager version and exits
-ultima
Switch to change P3_HOME to AM_HOME and p3manager_files to analysis_manager, so that no "p3" names are required in the environment. (Generally not used.)
-port <####>
Port number to use. The default is 1900 if not set. If using an org.cfg file then use this argument with the -org option below to force a port number and org name.
-path <path>
Use to specify base path for finding the Analysis Manager executables: $P3_HOME/p3manager_files/bin/{arch}/*Mgr.
<path> is the base path $P3_HOME. The default is the path the program was started with, but this fails when the program is started with a relative path (e.g., "./QueMgr ..."). If a full path is used to start QueMgr (as in a startup script), this argument is not needed.
-orgpath <path>
Use to specify base path for finding the Analysis Manager org tree (configuration files and directories): $P3_HOME/p3manager_files/{org}/{conf,log,proj}.
<path> is the base path $P3_HOME. Use to specify the base path to find the org tree only if different than the -path argument.
QueMgr writes files in the proj/{projectname} directories, so if this is not the default (desired) location (same as -path above), then this argument needs to be set.
-name <name>
Windows only. Use if you want to run more than one QueMgr service. Each must have a unique name so the start/stop method can distinguish which one to work with. Default <name> is MSCQueMgr.
-rmtmgrport <####>
The port number to use for ALL RmtMgrs that this QueMgr will connect to across the entire network. Default is 1800 (the default RmtMgr -port value) if not set.
-rmgrport <####>
Same as -rmtmgrport above.
-org <org>
org name to use. This is the name of the directory containing the configuration files for this Queue Manager daemon (i.e., $P3_HOME/p3manager_files/{org}/{conf,log,proj}). The default is default. If using an org.cfg file then use this with the -port option above to force a port number and org name.
-delayint <###>
Default is 20 seconds. This is rarely used. Every delay_interval seconds, the QueMgr asks another host in its list of all job hosts for a status. If it has not heard back from a host in (delay_interval * number_of_hosts * 3) + 30 seconds, roughly three round trips through the host list without a response, the QueMgr marks the host as DOWN and will not submit new jobs to it until it starts responding again. For example, with the default 20-second interval and five job hosts, a host is marked DOWN after 20 * 5 * 3 + 30 = 330 seconds of silence. Use this flag to lengthen the interval when network problems cause the Analysis Manager to consider hosts down that are not really down.
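Putting several of these arguments together, a QueMgr serving a non-default org on non-default ports might be started like this (the path, org name, and port numbers are illustrative):

   /msc/patran200x/p3manager_files/bin/HP700/QueMgr -port 1901 -org engineering -rmtmgrport 1801 &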
P3Mgr
This program is started by the user. If 4 arguments are present, they are interpreted as follows (an example follows the table):
Argument
Description
arg 1
Startup type. It is one of the following:
1 - Start Up Full Interface.
2 - Start Up Queue Monitor Now.
3 - Start Up Abort Job Now.
4 - Start Up Monitor Running Job Now.
5 - Start Up Monitor Completed Job Now.
6 - Start Up Submit Now. (Submit current job)
7 - Start Up Submit Quiet. (Submit current job without GUI)
8 - Start Up Submit Quiet and wait for job to complete. (with exit status)
arg 2
Extension of the job input file.
arg 3
Job name (with optional path).
arg 4
Application type (integer).
1 - MSC Nastran
2 - ABAQUS
3 - MSC.Marc
20 through 29 - General (user defined applications)
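For example, the following illustrative command submits the MSC Nastran job /home/user1/myjob.bdf without bringing up the GUI (startup type 7, input file extension bdf, application type 1):

   P3Mgr 7 bdf /home/user1/myjob 1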
If 4 more arguments are specified (Unix only), they are interpreted as follows:
 
Argument
Description
arg 5
X position of upper left corner of Patran right hand side interface in inches.
arg 6
Y position of upper left corner of Patran right hand side interface in inches.
arg 7
Width of Patran right hand side interface in inches.
arg 8
Height of Patran right hand side interface in inches.
The following arguments can be used alone or after the first 4 arguments above; an example follows the table:
 
Argument
Description
-rcf <file>
rcf file to use for all GUI settings (same format as -env/-envall output) - see Analysis Manager Environment File.
-auth <file>
License file to use. Environment variable MSC_LICENSE_FILE is the default. This can also point to a port as well as a physical license file (with path), e.g., -auth 1700@banff
-env
Prints the rcf / GUI settings for all applications.
-envall
Same as -env but even more information is printed.
-extra <args>
Adds extra arguments to the end of a particular command line.
-runtype <#>
ABAQUS ONLY. Set run type to:
0 - full analysis
1 - restart
2 - data check
-restart <file>
ABAQUS ONLY - coldstart filename for restart.
-coldstart <file>
MSC Nastran ONLY - coldstart filename for restart. MSC.Marc uses the rcfile - see Analysis Manager Environment File.
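For example, an illustrative quiet submit that waits for completion, reads GUI settings from an rcfile, and points at a license server:

   P3Mgr 8 bdf /home/user1/myjob 1 -rcf /home/user1/myjob.rcf -auth 1700@banff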
TxtMgr
This program is started by the user to manage jobs through a simple text submittal program. Possible arguments are listed below; an example invocation follows the table:
 
Argument
Description
-version
Same as RmtMgr.
-qmgrhost <hostname>
Hostname the QueMgr is running on. Default is the local host if no org.cfg is found.
-qmgrport <####>
Port QueMgr is running on. Default is 1900 if no org.cfg is found.
-rmgrport <####>
Port for ALL RmtMgr's for this org (QueMgr). Not needed unless using the Admin test feature and the default RmtMgr port is not being used.
-org <org>
org to use. Default is default.
-orgpath <path>
Same as RmtMgr. Needed for writing configuration files and/or Admin tests if it is not the default path (default is $P3_HOME).
-auth <file>
License file to use. Environment variable MSC_LICENSE_FILE is the default.
-app <name>
Application name to use. Default is MSC Nastran (or first valid app).
-rcf <file>
rcf file to use for all GUI settings (same format as -env/-envall output); see Analysis Manager Environment File.
-p3home <path>
Switch to use if $P3_HOME environment variable is not set.
-amhome <path>
Switch to use if $AM_HOME environment variable is not set.
-choice <#>
Startup option if not full menu:
1) submit a job
2) abort a job
3) monitor a job
4) show QueMgr log file
5) show QueMgr jobs/queues
6) show QueMgr cpu/mem/disk
7) list completed jobs
8) write rcfile settings
9) admin test
10) admin reconfig QueMgr
-env
Prints the rcf / GUI settings for all applications.
-envall
Same as -env but even more information is printed.
-envf <file>
Write env settings to file specified.
-envfall <file>
Same as -envf but even more information is written.
-nocon
Do not attempt to connect to a QueMgr. Useful for when one is not running and you want to test the Admin configuration files, etc.
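For example, to submit a job from a text terminal against a QueMgr running on another machine (the host name is illustrative; the port is the documented default):

   TxtMgr -qmgrhost tavarua -qmgrport 1900 -choice 1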
Analysis Manager Environment File
The -env and -envall arguments to some of the above programs (P3Mgr in particular) list the environment settings used in the Analysis Manager. The environment can be set by reading a particular file with the -rcf argument. Default values of this environment are found in the .p3mgrrc file, which is stored in the user's home directory when any settings are saved from within the Analysis Manager. Almost all the widgets in the P3Mgr user interface can be set by reading an rcfile. When MSC.Marc jobs are submitted via Patran, additional parameters such as the restart filename, the number of domains, host information for parallel processing, and other information are passed to the Analysis Manager via this rcfile. There is an entry in the rcfile for each widget in the user interface. A list of these entries is given below. Notice that the configuration information is also listed in the rcfile. Configuration information is explained in Configuration Management Interface and Examples of Configuration Files.
#
# rc file ---
#
cfg.total_h_list[0].host_name = tavarua
cfg.total_h_list[0].arch = HP700
cfg.total_h_list[0].maxtasks = 2
cfg.total_h_list[0].num_apps = 3
cfg.total_h_list[0].sub_app[MSC.Nastran].pseudohost_name = tavarua_nast2001
cfg.total_h_list[0].sub_app[MSC.Nastran].exepath = /solvers/nast2001/bin/nast2001
cfg.total_h_list[0].sub_app[MSC.Nastran].rcpath = /solvers/nast2001/conf/nast2001rc
cfg.total_h_list[0].sub_app[ABAQUS].pseudohost_name = tavarua_aba62
cfg.total_h_list[0].sub_app[ABAQUS].exepath = /solvers/hks/Commands/abaqus
cfg.total_h_list[0].sub_app[ABAQUS].rcpath = /solvers/hks/6.2-1/site/abaqus_v6.env
cfg.total_h_list[0].sub_app[MSC.Marc].pseudohost_name = tavarua_marc2001
cfg.total_h_list[0].sub_app[MSC.Marc].exepath = /solvers/marc2001/tools/run_marc
cfg.total_h_list[0].sub_app[MSC.Marc].rcpath = /solvers/marc2001/tools/include
cfg.total_h_list[1].host_name = salani
cfg.total_h_list[1].arch = WINNT
cfg.total_h_list[1].maxtasks = 2
cfg.total_h_list[1].num_apps = 3
cfg.total_h_list[1].sub_app[MSC.Nastran].pseudohost_name = salani_nast2001
cfg.total_h_list[1].sub_app[MSC.Nastran].exepath = d:\msc\bin\nast2001.exe
cfg.total_h_list[1].sub_app[MSC.Nastran].rcpath = d:\msc\conf\nast2001.rcf
cfg.total_h_list[1].sub_app[ABAQUS].pseudohost_name = salani_aba62
cfg.total_h_list[1].sub_app[ABAQUS].exepath = d:\hks\Commands\abq621.bat
cfg.total_h_list[1].sub_app[ABAQUS].rcpath = d:\hks\6.2-1\site\abaqus_v6.env
cfg.total_h_list[1].sub_app[MSC.Marc].pseudohost_name = salani_marc2001
cfg.total_h_list[1].sub_app[MSC.Marc].exepath = d:\msc\marc2001\tools\run_marc.bat
cfg.total_h_list[1].sub_app[MSC.Marc].rcpath = d:\msc\marc2001\tools\include.bat
#
unv_config.auto_mon_flag = 1
unv_config.time_type = 0
unv_config.delay_hour = 0
unv_config.delay_min = 0
unv_config.specific_hour = 0
unv_config.specific_min = 0
unv_config.specific_day = 0
unv_config.mail_on_off = 0
unv_config.mon_file_flag = 1
unv_config.copy_link_flag = 0
unv_config.job_max_time = 0
unv_config.project_name = user1
unv_config.orig_pre_prog =
unv_config.orig_pos_prog =
unv_config.exec_pre_prog =
unv_config.exec_pos_prog =
unv_config.separate_user = user1
unv_config.p3db_file =
unv_config.email_addr = empty
#
nas_config.disk_master = 0
nas_config.disk_dball = 0
nas_config.disk_scratch = 0
nas_config.disk_units = 2
nas_config.scr_run_flag = 1
nas_config.save_db_flag = 0
nas_config.copy_db_flag = 0
nas_config.mem_req = 0
nas_config.mem_units = 0
nas_config.smem_units = 0
nas_config.extra_arg =
nas_config.num_hosts = 2
nas_host[tavarua.scm.na.mscsoftware.com].mem = 0
nas_host[tavarua.scm.na.mscsoftware.com].smem = 0
nas_host[tavarua.scm.na.mscsoftware.com].num_cpus = 0
nas_host[lalati.scm.na.mscsoftware.com].mem = 0
nas_host[lalati.scm.na.mscsoftware.com].smem = 0
nas_host[lalati.scm.na.mscsoftware.com].num_cpus = 0
nas_config.default_host = tavarua_nast2001
nas_config.default_queue = N/A
nas_submit.restart_type = 0
nas_submit.restart = 0
nas_submit.modfms = 1
nas_submit.nas_input_deck =
nas_submit.cold_jobname =
#
aba_config.copy_res_file = 1
aba_config.save_res_file = 0
aba_config.mem_req = 0
aba_config.mem_units = 0
aba_config.disk_units = 2
aba_config.space_req = 0
aba_config.append_fil = 0
aba_config.user_sub =
aba_config.use_standard = 1
aba_config.extra_arg =
aba_config.num_hosts = 2
aba_host[tavarua.scm.na.mscsoftware.com].num_cpus = 1
aba_host[tavarua.scm.na.mscsoftware.com].pre_buf = 0
aba_host[tavarua.scm.na.mscsoftware.com].pre_mem = 0
aba_host[tavarua.scm.na.mscsoftware.com].main_buf = 0
aba_host[tavarua.scm.na.mscsoftware.com].main_mem = 0
aba_host[lalati.scm.na.mscsoftware.com].num_cpus = 1
aba_host[lalati.scm.na.mscsoftware.com].pre_buf = 0
aba_host[lalati.scm.na.mscsoftware.com].pre_mem = 0
aba_host[lalati.scm.na.mscsoftware.com].main_buf = 0
aba_host[lalati.scm.na.mscsoftware.com].main_mem = 0
aba_config.default_host = tavarua_aba62
aba_config.default_queue = N/A
aba_submit.restart = 0
aba_submit.aba_input_deck =
aba_submit.restart_file =
#
mar_config.disk_units = 2
mar_config.space_req = 0
mar_config.mem_req = 0
mar_config.mem_units = 2
mar_config.translate_input = 1
mar_config.num_hosts = 2
mar_host[tavarua.scm.na.mscsoftware.com].num_cpus = 1
mar_host[lalati.scm.na.mscsoftware.com].num_cpus = 1
mar_config.default_host = tavarua_marc2001
mar_config.default_queue = N/A
mar_config.cmd_line =
mar_config.mon_file = $JOBNAME.sts
mar_submit.save = 0
mar_submit.nprocd = 0
mar_submit.datfile_name =
mar_submit.restart_name =
mar_submit.post_name =
mar_submit.program_name =
mar_submit.user_subroutine_name =
mar_submit.viewfactor =
mar_submit.hostfile =
mar_submit.iamval =
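As a sketch of how such a file can be used (file and job names are illustrative): settings written with TxtMgr's -envfall option can be fed back to either program through -rcf.

   TxtMgr -envfall /home/user1/mysettings.rcf
   P3Mgr 7 bdf /home/user1/myjob 1 -rcf /home/user1/mysettings.rcf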