Fatigue Quick Start Guide > A Multiaxial Assessment > Determine the Critical Location
Determine the Critical Location
You are now ready to set up the fatigue analysis to determine the critical location. Copy the file knuckle_ma.fin to your directory. Instead of filling in the forms as in all previous exercises, you will read in a fatigue job setup file; you can, of course, still fill in the forms manually if you prefer.
Bring up the MSC Fatigue main setup form and open the Job Control... form. Set the Action to Read Saved Job, select the job knuckle_ma, and click Apply. The parameters from the file are read in and the widgets on the various forms are filled in.
Reading the saved job recovers all the information from the job and sets up the forms. One of the convenient features of MSC Fatigue is the ease with which job files are handled: if you have a number of similar jobs to run, you can simply change the job name, make any other edits, save the job, and repeat this process as often as necessary. If desired, all the resulting jobs may then be run in batch mode. This is the most efficient way of working when you have many analyses to carry out.
What reading an old job file does not do, however, is recreate the material group(s) used in the previous analysis in case they have been removed or changed, nor does it ensure that the referenced result cases actually exist. Verifying these is up to the user.
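If you find yourself preparing many near-identical jobs, the rename-and-save cycle can also be scripted outside the GUI. The sketch below assumes only that the .fin setup file is plain text containing the jobname literally; the file contents written here are illustrative stand-ins, not a real MSC Fatigue job file:

```python
# Minimal sketch: duplicate an MSC Fatigue job setup (.fin) file under a
# new jobname.  Assumes the file is plain text with the jobname appearing
# literally inside it; check against your own files before relying on it.

def clone_job(old_job: str, new_job: str) -> None:
    """Copy old_job.fin to new_job.fin, renaming the job inside."""
    with open(old_job + ".fin") as f:
        text = f.read()
    with open(new_job + ".fin", "w") as f:
        f.write(text.replace(old_job, new_job))

# Purely illustrative stand-in for a real job file:
with open("knuckle_ma.fin", "w") as f:
    f.write("JOBNAME=knuckle_ma\nANALYSIS=INITIATION\n")

clone_job("knuckle_ma", "knuckle_ma_b")
```

Each cloned .fin can then be read in as above, or the set run in batch mode.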
 
Hint:  
Another way to easily and conveniently read in an old job setup file is to type the jobname in the Jobname databox on the main form and press the carriage return. If a file called jobname.fin is detected in the local directory, it will be read. This can be more convenient than opening the Job Control... form.
The General Setup Parameters should appear as follows:
1. Analysis: Initiation
2. Results Loc.: Node
3. Nodal Ave.: Global
4. F.E. Results: Stress
5. Res. Units: MPa
6. Jobname: knuckle_ma
7. Title: Slalom on cobblestones, but with loads scaled by a factor of 3
Now open the various forms to see how the job has been setup.
Solution Parameters
Open the Solution Params... form.
1. Analysis Method: S-W-T
S-W-T (Smith-Watson-Topper) is a variant of the standard strain-life methodology that takes into account the mean stress of each cycle.
2. Plasticity Correction: Neuber
Neuber is the default elastic-plastic correction method.
3. Run Biaxiality Analysis: ON
This is the only real departure from previous examples.
4. Biaxiality Correction: None
This is the default correction method. Correction methods will be discussed later.
5. Stress/Strain Combination: Max. Abs. Principal
The Max. Abs. Principal is the default choice of Stress/Strain Combination. This is the principal strain that has the largest magnitude (in a uniaxial test, this would be the axial strain).
6. Certainty of Survival (%): 50.0
The Certainty of Survival (%) defaults to 50%. This means that the component will have a 50% chance of surviving the calculated life. The probability is based on the scatter defined in the material parameters.
7. Run Factor of Safety Analysis: OFF
Many components, e.g., crankshafts, are designed for infinite life; these are better assessed with a Factor of Safety analysis. This is not covered by this demonstration.
Close the Solution Params... form when done.
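The Neuber and S-W-T choices above can be illustrated numerically. The sketch below uses assumed, representative strain-life constants (illustration only, not the MANTEN database values) to correct an elastic stress amplitude to a local stress-strain pair via Neuber's rule with a Ramberg-Osgood cyclic curve, then solves the S-W-T damage equation sigma_max * eps_a * E = (sigma_f')^2 * (2Nf)^(2b) + sigma_f' * eps_f' * E * (2Nf)^(b+c) for life. It is a zero-mean-stress illustration of the method, not MSC Fatigue's implementation:

```python
import math

# Assumed, representative strain-life constants (illustration only):
E   = 203000.0   # Young's modulus, MPa
sf  = 917.0      # fatigue strength coefficient sigma_f', MPa
b   = -0.095     # fatigue strength exponent
ef  = 0.26       # fatigue ductility coefficient epsilon_f'
c   = -0.47      # fatigue ductility exponent
Kp  = 1200.0     # cyclic strength coefficient K', MPa
np_ = 0.20       # cyclic strain hardening exponent n'

def ramberg_osgood(sigma):
    """Total strain on the cyclic stress-strain curve."""
    return sigma / E + (sigma / Kp) ** (1.0 / np_)

def neuber(S):
    """Local stress for elastic (pseudo) stress amplitude S using
    Neuber's rule sigma * eps = S**2 / E, solved by bisection."""
    target = S * S / E
    lo, hi = 1e-6, S          # local stress cannot exceed the elastic stress
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mid * ramberg_osgood(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def swt_life(sigma_max, eps_a):
    """Reversals 2Nf from the S-W-T equation (finite-life regime assumed)."""
    lhs = sigma_max * eps_a * E
    def rhs(two_nf):
        return sf * sf * two_nf ** (2 * b) + sf * ef * E * two_nf ** (b + c)
    lo, hi = 1.0, 1e12        # rhs decreases with life (b and c are negative)
    for _ in range(200):
        mid = math.sqrt(lo * hi)          # bisect in log space
        if rhs(mid) > lhs:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

S = 400.0                      # elastic stress amplitude, MPa (example)
sigma = neuber(S)              # local stress after plasticity correction
eps = ramberg_osgood(sigma)    # local strain amplitude
print(sigma, eps, swt_life(sigma, eps) / 2.0)   # life in cycles
```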
Material Information
Open the Material Info... form. The original material of this component is not used in this example analysis. Instead a representative material, for illustration purposes only, has been selected.
The material information form is used to assign fatigue properties to different parts (groups) of the model. You can have up to 20 different groups which may have any combination of materials, surface finishes and treatments. Clicking on the material box gives access to a picklist of suitable materials from the database. Corrections can be made for surface finish and treatment. These are valid only for steels, though you can set up your own corrections if desired. In the case of this analysis, no correction is made, because the specimens were tested as-cast, i.e., with the same surface condition as the component. The region for this analysis is the group containing the surface nodes only. This speeds up the analysis.
The spreadsheet on the Material Info... form is filled out as follows:
1. Material: MANTEN
2. Finish: No Finish
3. Treatment: No Treatment
4. Region: KNUCKLE_ONLY
 
Note:  
You can change the Region to the group Surface_elements that you created earlier, provided that the nodes exist in it as well. A very common mistake, which results in an error during translation, is selecting a group that does not contain nodes when a nodal fatigue analysis has been requested, or one that does not contain elements when an element-centroidal fatigue analysis has been selected.
Close the Material Info... form when done. Always use the OK button when changes have been made; if you use the Cancel button, any changes will not be saved.
Loading Information
Copy the load variation signals to your local directory. They are called knuckle*.dac, where * is a wildcard for the twelve load cases.
Open the Loading Info... form.
Load the Time History Files
The spreadsheet appears filled out on the form but the actual time history files are not loaded into the time history database yet. Press the Time History Manager button to invoke PTIME.
When PTIME appears select the Load files option. A form will appear from which you can load all files at once. In the Source Filename databox type *.dac and press the Tab key so that the Target Filename databox automatically gets a wild card * placed there. Ignore any warning messages if there are any. Put something in the Description 1 databox such as Cobblestone Loads. Click the OK button accepting all the other defaults. The files will be loaded into the database. PTIME will show you a list of the new entries that it loaded.
Note:  
If you have been working sequentially through this document, you may need to select Add an entry... before the Load files option is visible. In the Source Filename databox type knuckle*.dac, then proceed as described above.
One point ought to be made here. Nine of the 12 loads are forces in Newtons; the other three are moments in Nmm. We loaded all files as Forces (N). In practice, this makes no difference to the analysis: the load type and units are simply labels. It is up to the user to make sure that the loading in the time history file and the loading in the FE model use consistent and compatible units, regardless of how they are labelled.
Customized Loads and Units
Change the details of the three moments (KNUCKLE10, 11, and 12) using the Change an entry | edit Details option. Change the Load type to Moment and the Units to Nmm. One problem you may encounter is that Nmm may not be among the defined units; your only choices may be Nm or Ft lbs.
If you have access and privileges to modify the installation area of MSC Fatigue you can customize the load types and units. There are two files in
<install_dir>/mscfatigue_files/ptime (UNIX)
or on Windows:
x:\<install_dir>\mscfatigue_files\ptime
called ltypes.ind and utypes.ind. You can edit these files to add your own load types and/or units if they do not exist. For instance, edit utypes.ind and add the following line at the bottom of the file:
92  11  0.001     0   Nmm
The first number indicates the unit type ID; the second is the load type ID defined in ltypes.ind with which the units are associated; the third defines the conversion from SI units (N, m); the fourth is an offset; and the fifth is the common name. See the MSC Fatigue User’s Guide for more details.
If you are able to modify this file and wish to edit the details to change the moment units, you will have to stop and restart PTIME for it to recognize the changes.
If you don’t have access to modify these files then simply select Nm as the unit types since it will not make any difference to the resulting fatigue calculations.
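As an illustration of how such a record might be used, the snippet below parses the five fields as described in the text; the direction of the conversion factor (multiply by 0.001 to go from Nmm to the SI Nm) is our reading of the description and should be checked against the User's Guide:

```python
# Sketch of the utypes.ind record described above.  Field meanings follow
# the text: unit type ID, associated load type ID (from ltypes.ind),
# multiplicative conversion from SI units, additive offset, common name.

def parse_unit(line):
    fields = line.split()
    return {
        "unit_id":   int(fields[0]),
        "load_type": int(fields[1]),
        "factor":    float(fields[2]),
        "offset":    float(fields[3]),
        "name":      " ".join(fields[4:]),
    }

def to_si(value, unit):
    """Assumed interpretation: value_in_SI = value * factor + offset,
    so 1000 Nmm * 0.001 = 1 Nm.  Verify against the User's Guide."""
    return value * unit["factor"] + unit["offset"]

u = parse_unit("92  11  0.001     0   Nmm")
print(u["name"], to_si(1000.0, u))
```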
 
Note:  
The only time that the actual load type and units are important is when you use the PTIME option Change an entry | Unit conversion to convert the selected time history to other units, although a compatibility check is made between the header of a .dac file and that specified in the jobname.fes file.
View the Time Histories
The loads have been derived from a single test-track event, namely a slalom on cobblestones. There are 12 load histories which correspond to the 12 FE static load cases. For example, knuckle09.dac is the Z vertical load on the strut mount corresponding to static FE load case 9. Loads are forces in Newtons (N) and Moments in Nmm.
Let us take a look at the time variations of the twelve load cases used in this example. With PTIME still running, select the Multi-channel... | Display Histories option, which runs the multi-file display module MMFD. Using the List facility, select as many files as you would like to view. You can select all 12, but only eight will be visible at once. Use the Shift key to make multiple selections from the file browser. Note that the files will not appear in the databox, but the number of files selected will appear below it. Accept all the other defaults on the form and click OK. The files will be displayed.
If you displayed more than eight, use the View | Scrn_Options | Next Scrn option to view the rest of the time histories. Exit from MMFD and PTIME when you are done.
The Load Association Spreadsheet
The spreadsheet in the Loading Info... form is used to establish the association between the load histories and the FE load cases. The other piece of information required is the FE load case magnitude, which is used to ensure correct scaling of the stresses. In this analysis there are a couple of peculiarities. One is that the load case magnitudes are set to 333 (N) instead of 1000, which effectively scales all the loads up by a factor of 3. This was done to make the pictures prettier: the first pass of the analysis showed very little damage. The other peculiarity is the sign, which is due to a difference between the coordinate set used in the FE model and the one in which the load histories were defined. The spreadsheet should be filled out accordingly.
 
 
Load Case ID                  Time History   Load Magnitude
15.1-1.1-1- (Load Case 1)     KNUCKLE01      -333.
16.2-1.1-1- (Load Case 2)     KNUCKLE02      -333.
17.3-1.1-1- (Load Case 3)     KNUCKLE03       333.
18.4-1.1-1- (Load Case 4)     KNUCKLE04      -333.
19.5-1.1-1- (Load Case 5)     KNUCKLE05      -333.
20.6-1.1-1- (Load Case 6)     KNUCKLE06       333.
21.7-1.1-1- (Load Case 7)     KNUCKLE07      -333.
22.8-1.1-1- (Load Case 8)     KNUCKLE08      -333.
23.9-1.1-1- (Load Case 9)     KNUCKLE09       333.
24.10-1.1-1- (Load Case 10)   KNUCKLE10      -333.
25.11-1.1-1- (Load Case 11)   KNUCKLE11      -333.
26.12-1.1-1- (Load Case 12)   KNUCKLE12       333.
 
Hint:  
There is a toggle called Fill Down on the Loading Info... form. If you have many load cases, filling out each cell in the spreadsheet becomes tedious. With this toggle ON, when you select an entry such as a Load Case ID or Time History, all the cells below the active cell are also filled in with the successive Load Case IDs or Time Histories available. This is a very convenient tool.
 
Note:  
Depending on the coordinate system in which your stresses are defined, you may want or need to set the Transform to Basic option ON in the Loading Info... form. This will have the effect of transforming all results into the global system such that all results are in the same coordinate system before nodal averaging. This ensures proper nodal averaging and that the subsequent surface resolution will be as good as possible.
Note also that Results Transformations is set to No Transformation. This is because the results are nodal and in surface resolved coordinates, and we wish them to remain so. Close the Loading Info... form when done.
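For the curious, the scaling role of the Load Magnitude column amounts to quasi-static superposition: at each instant the stress tensor is the sum over load cases of the static FE stresses scaled by P_k(t) / P_k,FE, which is where both the 333 magnitude and the sign take effect. The sketch below (NumPy, made-up numbers, not MSC Fatigue's internal code) also extracts the Max. Abs. Principal combination selected earlier:

```python
import numpy as np

def superpose(fe_stress, fe_magnitude, histories):
    """fe_stress:    (n_cases, 3, 3) static FE stress tensors at one node
       fe_magnitude: (n_cases,) load magnitudes applied in the FE runs
       histories:    (n_cases, n_steps) load time histories
       returns       (n_steps, 3, 3) stress tensor history"""
    scale = histories / fe_magnitude[:, None]          # P_k(t) / P_k,FE
    return np.einsum("kt,kij->tij", scale, fe_stress)

def abs_max_principal(tensors):
    """Principal value of largest magnitude at each step (the default
       Stress/Strain Combination)."""
    vals = np.linalg.eigvalsh(tensors)                 # ascending per step
    return np.where(np.abs(vals[:, 0]) > np.abs(vals[:, -1]),
                    vals[:, 0], vals[:, -1])

# Two illustrative load cases, two time steps:
fe = np.array([np.diag([100.0, 20.0, 0.0]), np.diag([-50.0, 10.0, 0.0])])
mag = np.array([-333.0, 333.0])
hist = np.array([[-333.0, 333.0], [333.0, -333.0]])
sigma_t = superpose(fe, mag, hist)
print(abs_max_principal(sigma_t))
```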
Job Control
If you do not want to wait for the analysis to run, copy the file knuckle_ma.fef to your directory and go on to Evaluate Results. Otherwise, open the Job Control... form, set the Action to Full Analysis, and click the Apply button to run the job. Monitor the job from time to time until it is complete. Because of the complexity of the loading, the job takes a while to run.
 
Note:  
If you do not want to wait that long, you might want to do the fast analysis run instead.
Fast Analysis
Your analysis can be made to run faster by turning ON the Simplified Analysis toggle in the Job Control... form for a multiple load case analysis. The analysis performs peak-valley-slicing to reduce the time histories and runs using these reduced histories. This quickly identifies the nodes with the most damage; the original time histories are then used in a complete analysis of only the identified locations.
 
Note:  
This does make it more difficult to view the critical locations in the form of a contour plot because only the damaged locations are retained in a Simplified Analysis. The contour plot will not be continuous over the entire model.
CPU Times
There are certain things that affect the CPU time it takes to run a fatigue analysis. These are:
1. Number of Analysis Locations (Nodes or Elements). Selecting only a certain group of locations can speed up the operation considerably. The challenge, if you do not know where the critical locations are beforehand, is knowing which areas to include in the group(s) you create.
2. The Number of Load Cases. There is not much you can do about this; the number of load cases is generally dictated by the loading environment. However, you may be able to eliminate some load cases if they have no influence on the life.
3. The Number of Time History Points. The number of points in each time history is a significant factor: the longer the time histories, the more computationally intensive the rainflow cycle counting procedure becomes. Peak-valley-slicing can be used to reduce time histories while still retaining the damaging events.
4. The Processor Speed. The final influence on the CPU time is, of course, the processor speed.
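The cost of item 3 comes mainly from rainflow cycle counting, whose work grows with the number of turning points. As a rough illustration only, here is a simplified three-point rainflow counter (the ASTM E1049 standard adds special handling around the start of the residue, and MSC Fatigue's own counter is certainly more elaborate):

```python
def turning_points(series):
    """Strip non-reversal points: keep only peaks and valleys."""
    tp = list(series[:1])
    for x in series[1:]:
        if x == tp[-1]:
            continue                      # flat spot: ignore
        if len(tp) >= 2 and (tp[-1] - tp[-2]) * (x - tp[-1]) > 0:
            tp[-1] = x                    # same direction: extend excursion
        else:
            tp.append(x)                  # direction change: new reversal
    return tp

def rainflow(series):
    """Simplified three-point rainflow count.  Returns (range, count)
    pairs: closed cycles count 1.0, leftover residue reversals 0.5 each.
    The stack loop is why long histories are expensive, and why
    peak-valley-slicing them first pays off."""
    stack, cycles = [], []
    for p in turning_points(series):
        stack.append(p)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])   # current range
            y = abs(stack[-2] - stack[-3])   # previous range
            if x < y:
                break                        # cycle not yet closed
            cycles.append((y, 1.0))          # close the inner cycle...
            del stack[-3:-1]                 # ...and drop its two points
    for a, b in zip(stack, stack[1:]):
        cycles.append((abs(a - b), 0.5))     # residue as half cycles
    return cycles

print(rainflow([-2, 1, -3, 5, -1, 3, -4, 4, -2]))
```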
Peak-Valley-Slicing
The original load histories, which were around 44,000 points each, have been reduced to around 1,600 points using a multi-channel peak-valley-slicing program called MPVXMUL.
Peak-valley-slicing is a fairly simple mechanism which tracks and extracts the peaks and the valleys of all signals to be used in an analysis. Whenever a peak or a valley is encountered in one of the signals, the corresponding points from the other signals are also retained. This procedure can be accomplished directly from PTIME using the Multi-channel... | Peak Valley Extract option, which will run MPVXMUL. You may wish to try this while the analysis is running.
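The mechanism just described can be sketched in a few lines (an illustration of the idea only; MPVXMUL's actual implementation, gating options, and file handling are more involved):

```python
def is_turning(prev, cur, nxt):
    """True when cur is a peak or valley of its channel."""
    return (cur - prev) * (nxt - cur) < 0

def multichannel_pv(channels):
    """channels: equal-length signals.  Keep every time index at which
    any channel reverses (plus the endpoints), so that the retained
    points stay in phase across all channels."""
    n = len(channels[0])
    keep = {0, n - 1}
    for ch in channels:
        for i in range(1, n - 1):
            if is_turning(ch[i - 1], ch[i], ch[i + 1]):
                keep.add(i)
    idx = sorted(keep)
    return [[ch[i] for i in idx] for ch in channels]

# Two illustrative channels; reversals of either force a point to be kept:
a = [0, 1, 2, 1, 0, -1, 0]
b = [0, 3, 1, 2, 5, 2, 1]
print(multichannel_pv([a, b]))
```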
Open PTIME from the Loading Info... form and invoke PTIME from the Time History Manager button. Then select the Multi-channel | Peak Valley Extract option. When MPVXMUL appears select DAC as the file type.
The next screen asks for the input files (channels). Accept all the defaults. The names of the files must have the same (generic) name in front of the channel numbers (KNUCKLExx.DAC). The output file names will have a .pvx extension.
The next screen is the Analysis Set-up where you specify by which method to do the slicing. Accept the defaults and see the MSC Fatigue User’s Guide for detailed descriptions of these methods.
Finally a spreadsheet is presented to you with the names and statistics of the signals to be sliced. There are two editable column, F and G. You must fill in one of these columns in order to affect a change in the original signals. In the first cell of column G (Gate %) enter 10 and press the carriage return. A 10 will appear under the File pull-down menu. Press the Copy button. This will copy 10 down the column for all the signals.
A percentage gate specifies a percentage of the total stress or strain range of the time history. For example, if the largest range is 1000 MPa and the gate is set to 10%, then any cycles encountered with ranges below this gate (100 MPa) will be ignored. The program does not actually count cycles but, during the course of the peak-valley extraction process, the number of turning points detected is restricted by imposing this hysteresis “gate”. This gate corresponds to the smallest difference between adjacent turning points that can be accepted. For turning points to be counted, they must be separated by a distance greater than the specified gate. By these means, small disturbances or “noise” in the time series may be “gated out” from the set of extracted turning points.
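The gate rule as stated, a turning point is kept only when separated from the last kept point by more than the gate, can be sketched as follows (a rough illustration of the rule only; MPVXMUL's real gating is applied during extraction and is more careful about preserving reversal structure):

```python
def gate_filter(points, gate_pct):
    """Drop turning points closer to the last kept point than
    gate_pct percent of the signal's total range (hysteresis gate)."""
    gate = (max(points) - min(points)) * gate_pct / 100.0
    kept = [points[0]]
    for p in points[1:]:
        if abs(p - kept[-1]) >= gate:
            kept.append(p)
    return kept

# A 100-unit dither near the peak is gated out at 10% of the 2000 range:
print(gate_filter([0, 1000, 900, 1000, -1000], 10))
```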
To perform the slicing, select File | OK. Load the files back into the Time History Database Manager with the Add an entry | Load files operation again, this time specifying *.pvx as the Source Filename. You may wish to use Multi-channel... | Display Histories to compare the before and after files in MMFD. The accompanying figure (not reproduced here) shows the time history for the first load case with a 49% gate applied.
After you have finished with this exercise you may wish to rerun the analysis using the *.pvx files to see the difference in the speed of the analysis and the accuracy of the answers.