Battery Data Publishing
#######################

This guide covers the Lucy CLI implementation of the BMS plugin; a more user-friendly GUI
equivalent is available in the desktop application.

The BMS plugin publishes Battery Lab Test data, typically provided as a BMS.ssv file,
although the data file(s) may have any name. Alongside the BMS data file there should be a
.wr file containing meta-data about the test. The plugin can also publish "Elements" data
in csv files, which have names like::

    2021-03-31_HEAD_CATL-1017_167Ah_0C_BOT_Drives_data.csv

The only difference between publishing BMS.ssv data and Elements csv data is the filename
pattern provided. Response names are 'mapped' into more human-readable names, and templates
rely on successful mappings.
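For orientation, a test directory handed to the plugin typically looks something like the
sketch below. The directory and file names here are illustrative only; as noted above, the
actual data files may have any name::

    21F0048/
        BMS.ssv          # BMS channel data, matched by the -f pattern
        21F0048.wr       # test meta-data file, posted via -post-files
        Protocol.txt     # optional extra file to post
        Completed.txt    # completion marker checked by the cronjob scripts below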
Command Line Arguments
~~~~~~~~~~~~~~~~~~~~~~

.. list-table:: Command Line Args
   :widths: 30 30 30 30 30 30 30
   :header-rows: 1

   * - Name
     - Mapping (dest)
     - Default
     - Action
     - Metavar
     - Help
     - Group
   * - -d
     - recursive_directory
     - 'empty string'
     - NA
     - NA
     - Directory
     - NA
   * - -f
     - bms_filename
     - 'empty string'
     - NA
     - NA
     - BMS file name
     - NA
   * - -recursive
     - recursive
     - 'empty string'
     - store_true
     - NA
     - True if the directory should be searched recursively
     - NA
   * - -time-channel
     - time_channel
     - Test_Time
     - NA
     - NA
     - Name of the time channel
     - NA
   * - -drop-channel
     - drop_channel
     - False
     - NA
     - NA
     - Channels to drop
     - NA
   * - -rsp-per-file
     - responses_per_file
     - 'empty string'
     - NA
     - NA
     - Number of responses per file
     - NA
   * - -config-file
     - config_file
     - 'empty string'
     - NA
     - NA
     - Config file
     - NA
   * - -rm-repeating
     - rm_repeating
     - NA
     - store_true
     - NA
     - Remove repeating values (only the points between the start and end, non-inclusive, of any run of consecutively repeating points whose size is greater than 2)
     - NA
   * - -error-factor
     - error_factor
     - 0
     - NA
     - NA
     - Error factor
     - NA
   * - -point-increment
     - point_increment
     - 'empty string'
     - NA
     - NA
     - Parse every point_increment values from the BMS data
     - NA
   * - -downsize
     - downsize
     - NA
     - store_true
     - NA
     - Use this flag to downsize the data points
     - NA
   * - -clean-initial-point
     - clean_initial_point
     - NA
     - NA
     - NA
     - Clean out the initial points from specified curves
     - NA
   * - -clean-final-zero
     - clean_trailing_zero
     - NA
     - store_true
     - NA
     - Clean out the trailing zero points
     - NA
   * - -timeout
     - timeout
     - 'empty string'
     - NA
     - NA
     - Number of seconds before killing the plugin
     - NA
   * - -program-filter
     - program_filter
     - 'empty string'
     - NA
     - NA
     - List of programs to filter based on
     - NA
   * - -database-file
     - database_file
     - 'empty string'
     - NA
     - NA
     - Database file
     - NA
   * - -xform-responses-per-call
     - response_chunk_size
     - 'empty string'
     - NA
     - NA
     - Number of responses to transform per call
     - NA
   * - -pre-scan
     - prescan
     - NA
     - store_true
     - NA
     - Scan before running
     - Prescan
   * - -output-file
     - output_file
     - NA
     - NA
     - NA
     - Prescan output file
     - Prescan
   * - -scratch-dir
     - scratch_dir
     - .
     - NA
     - NA
     - Scratch directory
     - Prescan
   * - -thread-count
     - thread_count
     - 4
     - NA
     - NA
     - Thread count
     - Prescan
   * - -publish
     - publish
     - NA
     - store_true
     - NA
     - Use this flag to publish a physical test to d3VIEW
     - Publishing
   * - -prescan-file
     - scan_database
     - NA
     - NA
     - NA
     - Prescan data location
     - Publishing
   * - -post-files
     - post_files
     - NA
     - NA
     - NA
     - Post files
     - Publishing
   * - -d3view-url
     - api_url
     - NA
     - NA
     - api_url
     - URL used to navigate to d3VIEW
     - Publishing
   * - -d3view-port
     - api_port
     - NA
     - NA
     - api_port
     - Port used to navigate to d3VIEW
     - Publishing
   * - -u, -user
     - user_id
     - NA
     - NA
     - username
     - Name of the user as shown in d3VIEW
     - Publishing
   * - -a, -api-key
     - api_key
     - NA
     - NA
     - api_key
     - API key corresponding to the d3VIEW user specified
     - Publishing
   * - -application-key
     - application_key
     - NA
     - NA
     - API_KEY
     - Application key
     - Publishing
   * - -program-name
     - program_name
     - 'empty string'
     - NA
     - NA
     - Name of the program associated with the physical test
     - Publishing
   * - -project-name
     - project_name
     - 'empty string'
     - NA
     - NA
     - Name of the project associated with the physical test
     - Publishing
   * - -template
     - response_template
     - 'empty string'
     - NA
     - NA
     - Template to apply
     - Publishing
   * - -template-mapping
     - template_mapping
     - NA
     - store_true
     - NA
     - Use the d3VIEW template to map data
     - Publishing
   * - -replace
     - remove_previous
     - NA
     - store_true
     - NA
     - Replace differing responses in an existing physical test
     - Publishing
   * - -force-replace
     - force_remove_previous
     - NA
     - store_true
     - NA
     - Replace an existing physical test
     - Publishing
   * - -verify
     - verify_responses
     - NA
     - store_true
     - NA
     - Verify local responses against an existing physical test
     - Publishing
   * - -restart
     - restart
     - NA
     - store_true
     - NA
     - Restart using an extracted folder
     - Publishing
   * - -single-upload
     - single_upload
     - NA
     - store_true
     - NA
     - Upload responses one at a time
     - Publishing
   * - -skip-files
     - skip_files
     - NA
     - store_true
     - NA
     - Skip posting files
     - Publishing
   * - -ppt
     - ppt
     - NA
     - store_true
     - NA
     - Generate PowerPoint
     - Publishing
   * - -remove-duplicate-responses, -rdr
     - remove_duplicate_responses
     - NA
     - store_true
     - NA
     - Pre-publish: remove empty duplicate response curves before sending to d3VIEW. Post-publish: remove duplicated responses.
     - Publishing
   * - -update-overlays, -uo
     - update_overlays
     - 'empty string'
     - NA
     - NA
     - Deletes and then re-extracts existing overlays
     - Updating
   * - -update-files
     - update_files
     - 'empty string'
     - NA
     - NA
     - Use this flag to update files for a physical test in d3VIEW
     - Updating
   * - -apply-template
     - apply_template
     - NA
     - store_true
     - NA
     - Use this flag to apply a template to a physical test in d3VIEW
     - Updating
   * - -meta-data
     - meta_data
     - NA
     - store_true
     - NA
     - Update only metadata for a physical test
     - Updating
   * - -project-update
     - project_update
     - NA
     - store_true
     - NA
     - Update the Project for a given Physical Test
     - Updating
   * - -send-mail
     - send_mail
     - NA
     - store_true
     - NA
     - Sends a report of published/skipped tests via sendmail
     - Mail
   * - -from
     - from
     - 'empty string'
     - NA
     - NA
     - From email address for the published/skipped test report
     - Mail
   * - -to
     - to
     - 'empty string'
     - NA
     - NA
     - To email address for the published/skipped test report
     - Mail
   * - -prefix
     - prefix
     - NA
     - NA
     - NA
     - Prefix for the published test name
     - Publishing
   * - -suffix
     - suffix
     - NA
     - NA
     - NA
     - Suffix for the published test name
     - Publishing
   * - -mapping-file
     - mapping_file
     - 'empty string'
     - NA
     - NA
     - Mapping file path
     - Publishing
   * - -mapping-output
     - mapping_output
     - NA
     - store_true
     - NA
     - Generates an output file for the physical test mapping
     - NA
   * - -verbose
     - unencrypt
     - NA
     - store_true
     - NA
     - NA
     - NA
   * - -cache
     - cache
     - NA
     - store_true
     - NA
     - NA
     - NA
   * - -sanitize-responses
     - sanitize_responses
     - NA
     - store_true
     - NA
     - NA
     - NA
   * - -api-cleaner
     - api_response_cleaner
     - NA
     - store_true
     - NA
     - NA
     - NA
   * - -use-mapper
     - use_mapper
     - NA
     - store_true
     - NA
     - Specify whether to use the mapper file to map source and destination names. Typically this should always be provided.
     - Publishing
   * - -use-mse
     - use_mse
     - NA
     - store_true
     - NA
     - Specify whether to use MSE-based mapping (even if CrossCheck column values contain 'yes' in the mapping file)
     - Publishing
   * - -rdp
     - rdp
     - 0.0
     - NA
     - NA
     - Use the RDP (Ramer-Douglas-Peucker) algorithm to reduce points for each response; the argument is the epsilon value. The d3VIEW worker's default epsilon value is 1.0. An argument of 0.0 (the default) means the RDP algorithm is not used.
     - Publishing
   * - -normalize
     - normalize
     - NA
     - store_true
     - NA
     - Use in conjunction with -rdp; applies min-max normalization (maps every y-value to a value within [0, 1])
     - Publishing
   * - -keep-columns
     - keep_columns
     - ''
     - NA
     - NA
     - Comma-separated list of column names to keep from the data file. Only this list of columns will be used from the data file.
     - Publishing
   * - -skip-columns
     - skip_columns
     - ''
     - NA
     - NA
     - Comma-separated list of column names to skip from the data file. This list of columns will not be used from the data file.
     - Publishing
   * - -compare-tests
     - compare_tests
     - ''
     - NA
     - NA
     - Compare differences in responses between two fully published tests. The argument shall be the test IDs of the two physical tests separated by a comma.
     - Publishing
   * - -compare-tests-metric
     - compare_tests_metric
     - 'raw'
     - NA
     - NA
     - Metric to use when comparing differences between tests. Calls the curves_match worker. The default value is raw if this option is not provided. Other possible values are: pdtw, abs, max, max-max-diff, critical_resultant, euclidean-distance, frechet, dynamic-time-warping, ratio_ydiff.
     - Publishing
   * - -compare-tests-output
     - compare_tests_output
     - ''
     - NA
     - NA
     - Compare differences in responses between two fully published tests. Use in conjunction with -compare-tests. This argument should be the path of the output file that highlights the differences.
     - Publishing

BMS Features
~~~~~~~~~~~~

.. list-table:: BMS Features
   :widths: 50 50
   :header-rows: 1

   * - Feature
     - Function
   * - -publish (and none of the below)
     - Extract responses, publish the Physical Test, apply a template (if provided with -r 'template_id')
   * - -sanitize-responses
     - Clean responses for an existing test (remove duplicates)
   * - -apply-template
     - Apply a template to an already existing test, or repair existing template responses
   * - -pre-scan
     - Generate a table of data about all scanned tests, including whether each has been published to d3VIEW
   * - -project-update
     - Change which "Project" a Physical Test is saved under
   * - -meta-data
     - Update meta-data for a published physical test
   * - -update-files
     - Update and repair any missing files associated with a Physical Test (for example, .wr files)
   * - ELEMENTS (publishing ELEMENTS .csv files)
     - The BMS plugin can alternatively be provided with a .csv containing 'Elements' data and process it

Key Arguments
~~~~~~~~~~~~~~

-d : Required. The directory to search for BMS.ssv (or 'pattern') files.

-f : Required. The search pattern for the BMS file name. If, for instance, you have multiple
Elements tests in a folder, you can set that folder as -d and set -f to "\*.csv" to search
for all csv files.

-recursive : Recommended. The flag to indicate that the .ssv files may be nested in
sub-directories.

-time-channel : Required. The name of the time channel.

-mapping-file : Provides the relationship mapping data used to convert the output of the
BMS.ssv into more human-readable response names. Mapped responses are also required for
templates to apply.

-mapping-output : Optional, but provides an overview of the mapped responses as a .csv file.

-publish : Without this flag the results are extracted, but the Physical Test is not
published.

-replace : Optional. Replaces differing responses when comparing local responses against
the remote ones of a previously published test.

-force-replace : Deletes the original physical test and replaces it with a new one.
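Before launching a long run it can help to sanity-check what a given -d / -f / -recursive
combination will pick up. The plugin applies its own matching logic, so the snippet below is
only a rough preview; it assumes the -f value is a regular expression, as in the cronjob
examples later in this guide, and the directory path is just a placeholder::

    #!/bin/sh
    # Rough preview of the files a -d/-f combination might match (sketch only).
    data_dir=~/sandbox/data/21F0048   # value intended for -d
    pattern=".*\.ssv"                 # value intended for -f

    # Without -recursive: only the top level of the directory.
    find "$data_dir" -maxdepth 1 -type f -regex ".*${pattern}"

    # With -recursive: all sub-directories as well.
    find "$data_dir" -type f -regex ".*${pattern}"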
Example Scripts
~~~~~~~~~~~~~~~

You will typically want to provide -r, -d, -u (or, equivalently, -user), -a (or,
equivalently, -api-key), -d3view-url, -application-key, and -time-channel. You will also
very likely want to include -mapping-file.

Submit Script
~~~~~~~~~~~~~~

Here is an example of a simple submit script to extract and publish a BMS Physical Test.
The key flag is -publish; if you are republishing a test that already exists, -force-replace
will overwrite the existing test with your newly published test.

::

    #!/bin/sh
    # Short file
    file=~/sandbox/data/21F0048/BMS.ssv

    ~/sandbox/lucymar11/bin/lucy plugins bms \
        -d3view-url http://10.1.10.205 \
        -d3view-port 3091 \
        -a 'api-key' \
        -application-key lucy \
        -f ${file} \
        -user eric.tang \
        -publish \
        -post-files wr,Protocol.txt \
        -prefix '01_' \
        -scratch-dir ~/sandbox/data/scratch/ \
        -mapping-file ~/sandbox/data/21F0048/BMS_MAPPING_03-08-2021_17.23.csv \
        -mapping-output \
        -replace \
        -remove-duplicate-responses

Sanitize Responses
~~~~~~~~~~~~~~~~~~

This example demonstrates how to sanitize responses on existing, published tests. Response
sanitization includes removing responses with null values and removing duplicate responses
from publication. The -uo option takes overlay names, updates the composite expression, and
then re-extracts.

::

    #!/bin/sh
    # Short file
    file=~/sandbox/data/21F0048/BMS.ssv

    ~/sandbox/lucymar11/bin/lucy plugins bms \
        -d3view-url http://10.1.10.205 \
        -d3view-port 3091 \
        -a 'api-key' \
        -application-key lucy \
        -f ${file} \
        -r 'template_id' \
        -user will.wharton \
        -sanitize-responses \
        -remove-duplicate-responses \
        -uo 'Cell Voltage Overlays,Cell Temperature Overlays,Cell Temperatures Overlays,Cell Voltages Overlay'

Prescan
~~~~~~~~~~~~~~

Scans a test file to identify whether it has been published to d3VIEW (along with other
details such as status, total channels, total points, file size, and datetime) and outputs
the results to a tsv.

::

    #!/bin/sh
    # Short file
    file=~/sandbox/data/21F0048/BMS.ssv

    ~/sandbox/lucymar11/bin/lucy plugins bms \
        -d3view-url http://10.1.10.205 \
        -d3view-port 3091 \
        -a 'api-key' \
        -application-key lucy \
        -f ${file} \
        -r 'template_id' \
        -user will.wharton \
        -pre-scan \
        -output-file 'dir_path'
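The prescan output pairs with the -prescan-file publishing option: a later publish run can
be pointed at the generated .tsv so that only the tests flagged as missing are imported (the
cronjob examples later in this guide do exactly this). A minimal sketch of the two-step flow
follows; the paths and install location are placeholders borrowed from the submit script
above, and a real run would usually carry the fuller option set shown in the cronjob
examples::

    #!/bin/sh
    lucy=~/sandbox/lucymar11/bin/lucy                # placeholder install location
    scan_out=~/sandbox/data/scratch/missing_bms_tests

    # Step 1: scan only, no publishing. Lucy appends .tsv to the output file name.
    $lucy plugins bms -pre-scan -output-file "$scan_out" \
        -d ~/sandbox/data -f ".*\.ssv" -recursive \
        -user eric.tang -a 'api-key' -application-key lucy \
        -d3view-url http://10.1.10.205 -time-channel Test_Time

    # Step 2: publish only the tests the scan reported as missing.
    $lucy plugins bms -publish -prescan-file "${scan_out}.tsv" \
        -d ~/sandbox/data -f ".*\.ssv" -recursive \
        -user eric.tang -a 'api-key' -application-key lucy \
        -d3view-url http://10.1.10.205 -time-channel Test_Time \
        -mapping-file ~/sandbox/data/21F0048/BMS_MAPPING_03-08-2021_17.23.csv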
Apply Template
~~~~~~~~~~~~~~

Takes an existing test and applies template extractions / responses.

::

    #!/bin/sh
    # Short file
    file=~/sandbox/data/21F0048/BMS.ssv

    ~/sandbox/lucymar11/bin/lucy plugins bms \
        -d3view-url http://10.1.10.205 \
        -d3view-port 3091 \
        -a 'api-key' \
        -application-key lucy \
        -f ${file} \
        -user eric.tang \
        -apply-template \
        -r 'template_id'

Meta Data
~~~~~~~~~~~~~~

Updates tests based on the content of the .wr file.

::

    #!/bin/sh
    # Short file
    file=~/sandbox/data/21F0048/BMS.ssv

    ~/sandbox/lucymar11/bin/lucy plugins bms \
        -d3view-url http://10.1.10.205 \
        -d3view-port 3091 \
        -a 'api-key' \
        -application-key lucy \
        -f ${file} \
        -user eric.tang \
        -meta-data

Update Files
~~~~~~~~~~~~~~

Looks for missing files for a test and uploads the missing files if they are present.

::

    #!/bin/sh
    # Short file
    file=~/sandbox/data/21F0048/BMS.ssv

    ~/sandbox/lucymar11/bin/lucy plugins bms \
        -d3view-url http://10.1.10.205 \
        -d3view-port 3091 \
        -a 'api-key' \
        -application-key lucy \
        -f ${file} \
        -user eric.tang \
        -update-files

Project Update
~~~~~~~~~~~~~~

Updates the test to sync with a project (by id).

::

    #!/bin/sh
    # Short file
    file=~/sandbox/data/21F0048/BMS.ssv

    ~/sandbox/lucymar11/bin/lucy plugins bms \
        -d3view-url http://10.1.10.205 \
        -d3view-port 3091 \
        -a 'api-key' \
        -application-key lucy \
        -f ${file} \
        -user eric.tang \
        -project-name 174 \
        -project-update
Comparing Tests and Filtering Responses
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Compares the differences between the mutual responses shared by two physical tests, whether
they are two existing published tests or one existing published test and the test that is
about to be published by the current script (whose physical test ID is not yet known). For
the former, provide two physical test IDs separated by a comma (enclose the value in quotes
if it contains a space); for the latter, provide just the single physical test ID to compare
against.

::

    plugins bms -publish \
        -user eric.tang \
        -application-key lucy \
        -a a320f40eb02ccd5f3942a4f10e26b490cd44f782 \
        -d3view-url http://portal.d3view.com:3091 \
        -mapping-file /Users/erictang/Desktop/lucy-source/lucy/test_data/bms/element_test_2/BMS_MAPPING.csv \
        -d /Users/erictang/Desktop/lucy-source/lucy/test_data/bms/element_test_2 \
        -f "*.csv" \
        -scratch-dir /Users/erictang/Desktop/lucy-source/lucy/test_data/bms/element_test_2/test_scratch \
        -time-channel RecordingTime \
        -timeout 2400 \
        -error-factor 0 \
        -thread-count 1 \
        -rsp-per-file 250 \
        -template 30025 \
        -recursive \
        -project-name BMS_project \
        -clean-initial-point outlier \
        -clean-final-zero \
        -match-type \
        -use-mapper \
        -auto-map \
        -post-files wr,Protocol.txt \
        -verbose \
        -compare-tests-metric pdtw \
        -compare-tests "2325980, 2381534" \
        -compare-tests-output /Users/erictang/Desktop/lucy-source/lucy/test_data/bms/element_test_2/compare_tests.json \
        -rm-repeating

Cronjobs
^^^^^^^^

Example Cronjob Importer Series:

- Morning Import
- Scan for Missing Tests
- Import Missing Tests

  - Only imports tests flagged as missing during the Scan

- Afternoon Import
- Night-time Import with -replace
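The scripts in the following sections are typically driven by cron. A hypothetical crontab
along these lines would implement the series above; the times and script paths are
placeholders, not taken from a real deployment::

    # Illustrative crontab only -- times and script paths are placeholders.
    0 6  * * *  /cae/hpcjobs/service/cron/bms_import.sh            # morning import
    0 12 * * *  /cae/hpcjobs/service/cron/bms_prescan.sh           # scan for missing tests
    30 12 * * * /cae/hpcjobs/service/cron/bms_publish_missing.sh   # import missing tests
    0 15 * * *  /cae/hpcjobs/service/cron/bms_import.sh            # afternoon import
    0 22 * * *  /cae/hpcjobs/service/cron/bms_import_replace.sh    # night-time import with -replace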
Example Import
~~~~~~~~~~~~~~

This script scans a directory for BMS test directories, copies the directories over based on
date, and then publishes the data.

::

    years=(2021 2020)
    from_dir_base=/mnt/uploads4/P/hevapps/uploads/WorkRequests/
    to_dir_base="/TMP1/d3view/current/"
    debug="no"
    days=1
    pattern=".*\.ssv"
    lucy_install=/cae/hpcjobs/service/lucy_may21
    bms_import_dir_base=$to_dir_base
    scratch_dir=/TMP1/d3view/scratch/current
    bms_data_dir=/cae/hpcjobs/apurva/BMS/data/
    bms_mapping_file=${bms_data_dir}/BMS_MAPPING_03-17-2021_16.55.csv
    bms_database_file=${bms_data_dir}/hevWrInfo.tsv
    d3VIEW_url="https://d3view-api.intra.chrysler.com:443"

    function transfer_file {
        from_file="$1"
        base_dir=`basename "$from_file"`
        to_dir="$to_dir_base""$2"
        echo " Copying ${from_file} to $to_dir ... "
        if [[ $debug = 'yes' ]]; then return; fi
        cp -r "$from_file" "$to_dir"/
        return
    }

    function publish_to_d3view {
        file="$1"
        year="$2"
        echo " Publishing $file to d3VIEW..."
        if [[ $debug = 'yes' ]]; then return; fi
        "$lucy_install"/bin/lucy plugins bms -publish \
            -user t5913sb \
            -f ${pattern} \
            -application-key lucy \
            -d3view-url ${d3VIEW_url} \
            -d "$bms_import_dir_base"/"$year"/"$file" \
            -mapping-file "$bms_mapping_file" \
            -scratch-dir ${scratch_dir} \
            -time-channel Test_Time \
            -rm-repeating \
            -timeout 2400 \
            -error-factor 0 \
            -thread-count 1 \
            -rsp-per-file 250 \
            -template 9370 \
            -project-name 174 \
            -recursive \
            -clean-initial-point all \
            -clean-final-zero \
            -post-files wr,Protocol.txt \
            -verbose
    }

    for year in "${years[@]}"; do
        from_dir="$from_dir_base""$year"
        if [[ ! -d $from_dir ]]; then
            echo " No from dir: $from_dir"
            echo " Exiting..."
            break
        fi
        parent_files=`find $from_dir ! -path $from_dir -type d -mtime -${days} -name "*[0-9]" | sort`
        for file in $parent_files; do
            SUB="_"
            folder=`basename "$file"`
            echo " Reviewing $file"
            if [[ "$folder" == *"$SUB"* ]]; then
                echo "iteration $folder"
                IFS="_" read -a TOKENS <<< "$folder"
                echo $TOKENS
                parent_name=${TOKENS[0]}
                echo "parent_name is $parent_name"
                if [[ -d $from_dir/$parent_name && -e $from_dir/$parent_name/Completed.txt && -e $from_dir/$parent_name/BMS.ssv && $from_dir/$parent_name/*.wr ]]; then
                    transfer_file "$file" "$year"
                    publish_to_d3view "$folder" "$year"
                else
                    echo " Skipping $file"
                fi
            else
                if [[ -e $from_dir/$folder/Completed.txt && -e $from_dir/$folder/BMS.ssv && $from_dir/$folder/*.wr ]]; then
                    transfer_file "$file" "$year"
                    publish_to_d3view "$folder" "$year"
                else
                    echo " Skipping $file"
                fi
            fi
        done
    done

Prescan
~~~~~~~~~~~~~~

Scans a directory of possible BMS test sub-directories and writes its findings to the
-output-file path. Note that Lucy appends .tsv to the output file name after the fact.

::

    #!/bin/sh

    years=(2021 2020)
    from_dir_base=/mnt/uploads4/P/hevapps/uploads/WorkRequests/
    to_dir_base="/TMP1/d3view/current/"
    debug="no"
    days=3
    pattern=".*\.ssv"
    lucy_install=/cae/hpcjobs/service/lucy_may21
    bms_import_dir_base=$to_dir_base
    scratch_dir=/TMP1/d3view/scratch/current
    bms_data_dir=/cae/hpcjobs/apurva/BMS/data/
    #bms_mapping_file=${bms_data_dir}/BMS_MAPPING_01-21-21_12-48.csv
    bms_mapping_file=${bms_data_dir}/BMS_MAPPING_03-17-2021_16.55.csv
    bms_database_file=${bms_data_dir}/hevWrInfo.tsv
    d3VIEW_url="https://d3view-api.intra.chrysler.com:443"

    cd /TMP1/d3view/scratch/current/cron_scratch
    mkdir $$
    cd $$

    "$lucy_install"/bin/lucy plugins bms \
        -user t5913sb \
        -f ${pattern} \
        -application-key lucy \
        -d3view-url ${d3VIEW_url} \
        -pre-scan \
        -output-file "${scratch_dir}"/missing_bms_tests \
        -d "$bms_import_dir_base"/2021/ \
        -mapping-file "$bms_mapping_file" \
        -scratch-dir ${scratch_dir} \
        -time-channel Test_Time \
        -timeout 2400 \
        -error-factor 0 \
        -thread-count 1 \
        -template 9370 \
        -recursive \
        -verbose

    mv lucy.log ../$$_lucy.log
    cd ..
    rm -r $$

Publish Missing
~~~~~~~~~~~~~~~

Publishes the missing tests identified by the prescan. Requires the -prescan-file argument
to point to the prescan tsv output.

::

    #!/bin/sh

    years=(2021 2020)
    from_dir_base=/mnt/uploads4/P/hevapps/uploads/WorkRequests/
    to_dir_base="/TMP1/d3view/current/"
    debug="no"
    days=1
    pattern=".*\.ssv"
    #lucy_install=/cae/hpcjobs/apurva/lucy_jan06_21
    #lucy_install=/cae/hpcjobs/apurva/lucy_april_28_2021
    lucy_install=/cae/hpcjobs/service/test_lucy_june7
    bms_import_dir_base=$to_dir_base
    scratch_dir=/TMP1/d3view/scratch/current
    bms_data_dir=/cae/hpcjobs/apurva/BMS/data/
    #bms_mapping_file=${bms_data_dir}/BMS_MAPPING_01-21-21_12-48.csv
    bms_mapping_file=${bms_data_dir}/BMS_MAPPING_03-17-2021_16.55.csv
    bms_database_file=${bms_data_dir}/hevWrInfo.tsv
    d3VIEW_url="https://d3view-api.intra.chrysler.com:443"

    cd /TMP1/d3view/scratch/current/cron_scratch
    mkdir $$
    cd $$

    "$lucy_install"/bin/lucy plugins bms -publish \
        -user t5913sb \
        -f ${pattern} \
        -application-key lucy \
        -d3view-url ${d3VIEW_url} \
        -mapping-file "$bms_mapping_file" \
        -scratch-dir ${scratch_dir} \
        -time-channel Test_Time \
        -rm-repeating \
        -prescan-file "${scratch_dir}"/missing_bms_tests.tsv \
        -timeout 2400 \
        -error-factor 0 \
        -thread-count 1 \
        -rsp-per-file 250 \
        -template 9370 \
        -project-name 174 \
        -recursive \
        -clean-initial-point all \
        -clean-final-zero \
        -post-files wr,Protocol.txt \
        -verbose

    mv lucy.log ../$$_lucy.log
    cd ..
    rm -r $$
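A run like this can also email its report of published and skipped tests via sendmail, using
the Mail-group options from the argument table. A minimal sketch, appended to the lucy
invocation above; the addresses are placeholders::

    "$lucy_install"/bin/lucy plugins bms -publish \
        ... (options as above) ... \
        -send-mail \
        -from d3view.importer@example.com \
        -to battery.team@example.com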
Elements (CSV) Submit Script Example
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

This example uses a hardcoded file name as the -f argument, but a pattern can be used, just
as with the .ssv BMS pattern.

::

    "$lucy_install"/bin/lucy plugins bms -publish \
        -user t5913sb \
        -f "R_EL_U_20160709094157_CustRec.csv" \
        -application-key lucy \
        -d3view-url ${d3VIEW_url} \
        -mapping-file "$bms_mapping_file" \
        -d /cae/hpcjobs/users/t5913sb/incoming \
        -replace \
        -scratch-dir ${scratch_dir} \
        -time-channel RecordingTime \
        -rm-repeating \
        -timeout 2400 \
        -error-factor 0 \
        -thread-count 1 \
        -rsp-per-file 250 \
        -template 9451 \
        -recursive \
        -clean-initial-point all \
        -clean-final-zero \
        -post-files wr,Protocol.txt \
        -verbose