$P3_HOME/p3manager_files/default/conf
AM Hostname | A unique name for the combination of the analysis application and physical host. It can be called anything but must be unique, for example nas68_venus. |
Physical Host | The physical host name where the analysis application will run. |
Type | The unique integer ID assigned to this type of analysis. It is assigned automatically by the program; the user does not need to set it. |
Path | How this machine finds the analysis application. For MSC Nastran, this is the runtime script (typically the nast68 file); for MSC.Marc, ABAQUS, or GENERAL applications, this is the executable location. |
rcpath | How this machine finds the analysis application runtime configuration file: the MSC Nastran nast68rc file or the ABAQUS site.env file. This is not applicable to an MSC.Marc or GENERAL application and should be filled with the keyword NONE. |
Physical Host | Name of host machine for the use of the Analysis Manager |
Class | Machine type (RS6K, HP700, etc.) |
Max | Maximum allowable concurrent processes for this machine |
MaxAppTsk | Maximum application tasks. This is used when, say, four MSC Nastran hosts are configured but there are only enough licenses for three concurrent jobs. Without this setting, the fourth job would always fail. With MaxAppTsk set to 3, the fourth job waits in the queue until one of the running jobs completes and is then submitted. This field is ONLY present if the configuration file version is >=2, set with the VERS: or VERSION: field at the top of the file. |
Note: | The MaxAppTsk setting must be added manually; there is no widget in the AdmMgr for it. If there are NO configuration files on startup of the AdmMgr, it sets the version to 2 and uses 1000 as the MaxAppTsk. If configuration files exist and version 2 is set, it honors whatever is already there and passes it through. If version 1 is set, MaxAppTsk is not written to the configuration files. |
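The MaxAppTsk admission rule described above can be sketched as follows. This is an illustrative model only (the function and job names are hypothetical); the real logic lives inside the Analysis Manager daemons:

```python
# Illustrative sketch of the MaxAppTsk rule: with MaxAppTsk set to 3,
# a fourth submitted job waits in the queue instead of failing.
# (Hypothetical helper; not the Analysis Manager's actual scheduler.)

def admit_jobs(submitted, max_app_tsk):
    """Split submitted jobs into those started now and those left queued."""
    running = submitted[:max_app_tsk]
    queued = submitted[max_app_tsk:]
    return running, queued

running, queued = admit_jobs(["job1", "job2", "job3", "job4"], max_app_tsk=3)
# running -> ["job1", "job2", "job3"]; queued -> ["job4"]
```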
Type | A number indicating analysis program type |
Prog_name | The name of the application job manager for this application |
Patran name | The name of the application which corresponds to the Patran analysis preference |
Options | Optional arguments for use with the GENERAL application. |
#------------------------------------------------------
# Analysis Manager host.cfg file
#------------------------------------------------------
#
#
# A/M Config file version
# Que Type: possible choices are P3, LSF, or NQS
#
VERSION: 2
ADMIN: am_admin
QUE_TYPE: MSC
#
#------------------------------------------------------
# AM HOSTS Section
#------------------------------------------------------
#
# Must start with a “P3AM_HOSTS:” tag.
#
# AM Host:
# Name to represent the choice as it will appear
# on the AM menus.
#
# Physical Host:
# Actual hostname of the machine to run the application on.
#
# Type:
# 1 - MSC.Nastran
# 2 - ABAQUS
# 3 - MSC.Marc
# 20 - User defined (General) application #1
# 21 - User defined (General) application #2
# etc. (max of 29)
#
# This field defines the application for this entry.
# Each value will have a corresponding entry in the
# “APPLICATIONS” section.
#
# EXE_Path:
# Path to the application executable or runtime script.
#
# RC_Path:
# Where runtime configuration file (if present) is found.
# Set to “NONE” if “General” application.
#
#------------------------------------------------------
# Physical Hosts Section
#------------------------------------------------------
#
# Must start with a “PHYSICAL_HOSTS:” tag.
#
# Class:
# HP700 - Hewlett Packard HP-UX
# RS6K - IBM RS/6000 AIX
# SGI5 - Silicon Graphics IRIX
# SUNS - Sun Solaris
# LX86 - Linux x86
# WINNT - Windows
#
# Max:
#
# Maximum allowable concurrent tasks for this host.
#
#------------------------------------------------------
# Applications Section
#------------------------------------------------------
#
# Must start with an “APPLICATIONS:” tag.
#
# Type: See above for values
# Prog_name:
#
# The name of the Patran AM Task Manager executable to start.
#
# This field must be set to the following, based on the
# application it represents:
#
# MSC.Nastran -> NasMgr
# HKS/ABAQUS -> AbaMgr
# MSC.Marc -> MarMgr
# Any General App -> GenMgr
#
# option args:
#
# This field contains the default command line which will
# appear in the AM user interface configure menu. This
# field is only valid for user defined (General) applications.
# The command line can contain any text including any of the
# following keywords (which will be evaluated at runtime):
#
# $JOBFILE Actual filename selected (w/o full path)
# $JOBNAME Jobname ($JOBFILE w/o extension)
# $P3AMHOST Hostname of AM host
# $P3AMDIR Dir on AM host where $JOBFILE resides
# $APPNAME Application name (P3 preference name)
# $PROJ Project Name selected
# $DISK Total Disk space requested (mb)
#
#
# AM Host Physical Host Type EXE_Path RC_Path
#---------------------------------------------------------------------
P3AM_HOSTS:
Venus_nas675 | venus | 1 | /msc/msc675/bin/nast675 | /msc/msc675/conf/nast675rc |
Venus_nas68 | venus | 1 | /msc/msc68/bin/nast68 | /msc/msc68/conf/nast68rc |
Venus_aba53 | venus | 2 | /hks/abaqus | /hks/site/abaqus.env |
Venus_mycode | venus | 20 | /mycode/script | NONE |
Mars_nas68 | mars | 1 | /msc/msc68/bin/nast68 | /msc/msc68/conf/nast68rc |
Mars_aba5 | mars | 2 | /hks/abaqus | /hks/site/abaqus.env |
Mars_mycode | mars | 20 | /mycode/script | NONE |
#---------------------------------------------------------------------
#
#Physical Host Class Max
#--------------------------------------------------------------
PHYSICAL_HOSTS:
venus | SGI5 | 2 |
mars | SUNS | 1 |
#--------------------------------------------------------------
#
#
#Type Prog_name MSC P3 name MaxAppTsk [option args]
#--------------------------------------------------------------
APPLICATIONS:
#--------------------------------------------------------------
1 | NasMgr | MSC.Nastran | 3 | |
2 | AbaMgr | ABAQUS | 3 | |
3 | MarMgr | MSC.Marc | 3 | |
20 | GenMgr | MYCODE | 3 | -j $JOBNAME -f $JOBFILE |
#---------------------------------------------------------------------
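As a sketch of how the $-keywords in the General-application option args might be expanded at runtime: the keyword names come from the comments above, but the substitution code itself is a hypothetical illustration, not the actual Analysis Manager implementation.

```python
# Hedged sketch of General-application keyword expansion.
# Keyword names are from the host.cfg comments; the mechanics here
# are illustrative only.
import os

def expand_keywords(template, job_path, host, app, proj, disk_mb):
    job_file = os.path.basename(job_path)      # $JOBFILE (no path)
    job_name = os.path.splitext(job_file)[0]   # $JOBNAME (no extension)
    subs = {
        "$JOBFILE": job_file,
        "$JOBNAME": job_name,
        "$P3AMHOST": host,
        "$P3AMDIR": os.path.dirname(job_path),
        "$APPNAME": app,
        "$PROJ": proj,
        "$DISK": str(disk_mb),
    }
    for key, val in subs.items():
        template = template.replace(key, val)
    return template

cmd = expand_keywords("-j $JOBNAME -f $JOBFILE",
                      "/scratch/wing.dat", "venus", "MYCODE", "default", 100)
# cmd -> "-j wing -f wing.dat"
```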
# Analysis Manager disk.cfg file
#---------------------------------------------------------------------
#
# AM Host
#
# AM host from the host.cfg file “P3AM_HOSTS” section.
#
# File System
#
# The filesystem directory
#
# Type
#
# The type of filesystem. If the filesystem is local
# to the machine, this field is left blank. If the
# filesystem is NFS mounted, the string “nfs” appears
# in this field.
#
# #
# AM Host | File System | Type (nfs or blank) |
#----------------------------------------------------------------------
Venus_nas675 | /user2/nas_scratch | |
Venus_nas675 | /venus/users/nas_scratch | |
#
Venus_nas68 | /user2/nas_scratch | |
Venus_nas68 | /venus/users/nas_scratch | |
Venus_nas68 | /tmp | |
#
Venus_aba53 | /user2/aba_scratch | |
Venus_aba53 | /venus/users/aba_scratch | |
Venus_aba53 | /tmp | |
#
Venus_mycode | /tmp | |
Venus_mycode | /server/scratch | nfs |
#
Mars_nas68 | /mars/nas_scratch | |
#
Mars_aba5 | /mars/users/aba_scratch | |
Mars_aba5 | /tmp | |
#
Mars_mycode | /tmp |
#---------------------------------------------------------------------
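A minimal sketch of reading the pipe-delimited disk.cfg rows above, assuming the field layout described in the comments (AM host, filesystem, optional nfs flag); the parser itself is illustrative, not part of the Analysis Manager:

```python
# Minimal sketch of parsing a disk.cfg data row as documented above.

def parse_disk_line(line):
    """Return (am_host, filesystem, fs_type) for a data row, else None."""
    line = line.strip()
    if not line or line.startswith("#"):
        return None  # blank line or comment/separator
    fields = [f.strip() for f in line.split("|")]
    am_host, filesystem = fields[0], fields[1]
    # An empty third field means the filesystem is local to the machine.
    fs_type = fields[2] if len(fields) > 2 and fields[2] else "local"
    return am_host, filesystem, fs_type

row = parse_disk_line("Venus_mycode | /server/scratch | nfs |")
# row -> ("Venus_mycode", "/server/scratch", "nfs")
```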
#------------------------------------------------------
# Analysis Manager lsf.cfg file
#------------------------------------------------------
#
# Below is the location (path) of the LSF executables (e.g. bsub)
#
QUE_PATH: /lsf/bin
QUE_OPTIONS:
QUE_MIN_MEM:
QUE_MIN_DISK:
#
# Below, each queue which will execute MSC tasks is listed.
# Each queue contains a list of hosts (from host.cfg) which
# are eligible to run tasks from the given queue.
#
# NOTE:
# Each queue can only contain one host of a given application
# version (i.e., if there are two version entries for
# MSC.Nastran, nas67 and nas68, then each queue
# set up to run MSC.Nastran tasks could include only
# one of these versions. To be able to submit to
# the other version, create a separate, additional
# MSC queue containing the same LSF queue name, but
# referencing the other version).
#
#
TYPE: 1
#
#MSC Que | LSF Que | Hosts |
#---------------------------------------------------------
Priority_nas | priority | mars_nas675, venus_nas675 |
Normal_nas | normal | mars_nas675, venus_nas675 |
Night_nas | night | mars_nas675 |
#---------------------------------------------------------
#
TYPE: 2
#
#MSC Que | LSF Que | Hosts |
#---------------------------------------------------------
Priority_aba | priority | mars_aba53, venus_aba53 |
Normal_aba | normal | mars_aba53, venus_aba53 |
Night_aba | night | mars_aba53, venus_aba53 |
#---------------------------------------------------------
$P3_HOME/p3manager_files/org.cfg
org | The organizational group name |
master host | The host on which the Queue Manager daemon is running for the particular organizational group in question |
port # | The unique port ID used for this Queue Manager daemon. Each Queue Manager must have been started with the -port option. |
#------------------------------------------------------
# Patran ANALYSIS MANAGER org.cfg file
#------------------------------------------------------
#
# Org | Master Host | Port # |
#------------------------------------------------------
default | casablanca | 1500 |
atf | atf_ibm | 1501 |
lsf_atf | atf_sgi | 1502 |
support | umea | 1503 |
user1
user2
sippola
smith
Note: | Any user account that is configured in this manner must exist not only on the machine where the analysis is going to run, but also on the machine from which the job was submitted. |
SORT_ORDER: free_tasks cpu_util last_job_time avail_mem free_disk
GROUP: grp_nas2004
MIN_DISK: 10
MIN_MEM: 5
MAX_CPU_UTIL: 95
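Assuming the obvious reading of these keywords (filter hosts by minimum disk and memory and maximum CPU utilization, then rank by the SORT_ORDER criteria), host selection could be sketched as follows. The sort directions are assumptions, not documented behavior:

```python
# Hedged sketch of how SORT_ORDER / MIN_DISK / MIN_MEM / MAX_CPU_UTIL
# might drive host selection. Key names mirror the settings above;
# the actual Analysis Manager logic may differ.

def pick_host(hosts, min_disk=10, min_mem=5, max_cpu_util=95):
    """Return the name of the best eligible host, or None if none qualify."""
    eligible = [h for h in hosts
                if h["free_disk"] >= min_disk
                and h["avail_mem"] >= min_mem
                and h["cpu_util"] <= max_cpu_util]
    # SORT_ORDER: free_tasks cpu_util last_job_time avail_mem free_disk
    # Assumed directions: more free task slots, lower CPU utilization,
    # older last-job time, then more memory and more disk are preferred.
    eligible.sort(key=lambda h: (-h["free_tasks"], h["cpu_util"],
                                 h["last_job_time"], -h["avail_mem"],
                                 -h["free_disk"]))
    return eligible[0]["name"] if eligible else None
```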