10. Automated Testing for Plugins

The automated tests for the plugins in this guide replace testing that was previously manual and time consuming; every manual step is performed programmatically. They are particularly useful for verifying new updates, changes to plugin features, and bug fixes. The plugins that support automated testing include BMS, CarTech, and HVAC (the MDF-DAT sync plugin), among others.

The results are stored in a csv file named Unit_Testing_Results.csv, located in the testing folder(s). Each row is a test and each column is a testing criterion, with values of Pass or Fail; the first couple of columns may contain purely informational data. A csv file is required as input and is passed as the argument to the --input CLI option. Automated tests are run as follows:

bin/lucy test unittest plugin_name --input /path/to/input.csv

where plugin_name is one of bms, cartech, or hvac.
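
Both the run and the Pass/Fail check can be scripted. The sketch below is a minimal illustration rather than part of the tooling: it assumes bin/lucy is invoked from the repository root, and the input and results paths are placeholders that need to be replaced with real locations.

import csv
import subprocess
from pathlib import Path

# Placeholder locations; replace with the real checkout and testing folder paths.
LUCY = Path("bin/lucy")
INPUT_CSV = Path("/path/to/input.csv")
RESULTS_CSV = Path("/path/to/testing_folder/Unit_Testing_Results.csv")


def run_plugin_tests(plugin_name: str) -> None:
    """Run the automated tests for one plugin (bms, cartech, or hvac)."""
    subprocess.run(
        [str(LUCY), "test", "unittest", plugin_name, "--input", str(INPUT_CSV)],
        check=True,
    )


def report_failures(results_csv: Path) -> None:
    """Print every row of Unit_Testing_Results.csv that contains at least one Fail."""
    with results_csv.open(newline="") as handle:
        for row in csv.DictReader(handle):
            failed = [column for column, value in row.items() if value == "Fail"]
            if failed:
                # The first column holds informational data such as the test file or folder path.
                identifier = next(iter(row.values()), "<unknown>")
                print(f"{identifier}: failed {failed}")


if __name__ == "__main__":
    run_plugin_tests("bms")
    report_failures(RESULTS_CSV)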

The input csv file (/path/to/input.csv), which can have any name, should look something like the following:

Sample input csv:

File path | Test ID to Compare | Type
/Volumes/ET/lucy/test_data/bms/10_test_Element_EMEA_Overall/data/20220517_M182_B041_SW_2112A0_P4.5_10C_EPA_Test_CAN5_3.6V_Charge.csv | 1111111 | EMEA
/Volumes/ET/lucy/test_data/cartech/12_test_CarTech_Overall/data/2022-01-24_R02S5_CATL-3050_144Ah_CSmpl_25C_BOT_mRPT_data.csv | 2222222 | Cartech
/Volumes/ET/lucy/test_data/bms/10_test_Element_EMEA_Overall/data/V_EL_F_SW_2A1_10HZ_20210903034702_CustRec.csv | 3333333 | EMEA
/Volumes/ET/lucy/test_data/bms/10_test_Element_EMEA_Overall/data/20220517_M182_B041_SW_2112A0_P4.5_10C_EPA_Test_CAN5_Charge.csv | 4444444 | EMEA
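
If the input csv needs to be generated from a list of data files rather than written by hand, a short script along the lines of the sketch below can produce it. The rows here are placeholders, and the header row simply mirrors the sample above; drop it if the plugin expects a headerless file.

import csv

# Placeholder rows; swap in real data file paths, test IDs, and types (EMEA, Cartech, ...).
rows = [
    ("/path/to/data/first_test_file.csv", "1111111", "EMEA"),
    ("/path/to/data/second_test_file.csv", "2222222", "Cartech"),
]

with open("input.csv", "w", newline="") as handle:
    writer = csv.writer(handle)
    # Header row mirroring the sample above.
    writer.writerow(["File path", "Test ID to Compare", "Type"])
    writer.writerows(rows)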

10.1. BMS Plugin Testing

Two general kinds of BMS tests are included in the automated testing: tests whose data comes in BMS.ssv files, with the meta data held in Protocol.txt and .wr files ("overall BMS tests"), and tests whose meta data sits in [HEADER START] sections at the top of each .csv data file (EMEA tests). The former are referred to as BMS type A tests and the latter as BMS type B tests.

BMS Type A

If the test folder is not yet in the correct path, it should be named 7_test_BMS_Overall and placed in test_data/bms/7_test_BMS_Overall. The Unit_Testing_Results.csv columns along with their explanations are as follows:

Option | Description
Test File Path | The file path of the test data.
Publish args | The configurations used in the plugin.
Meta Data | Verification of meta data. Checks Test Name, Temperature[degC], Program, Serial #, and Supplier.
Published and Test Name | Ensures the most recently published physical test is indeed the one being tested and verifies that the test name is as expected.
Original File Exists | Verifies that the original data file is still present.
responses json File Exists | Verifies that the responses json is still present. Side note: the responses json file(s) are required for the physical test to publish.
Protocol.txt, wr, and log files are attached | Verifies that Protocol.txt, the .wr file containing meta data, and log files are attached to the physical test.
All Responses Present | Checks that all of the response names are present as expected.
All Key Responses Present | Checks that all the key responses (typically template responses) are present as expected.
Reference Response Data Matches Published Test Data | Ensures that the raw response data for the set of reference responses matches exactly against the reference responses csv files (see the sketch after this table).
Test Benchmark Comparison | Uses the -compare-tests CLI argument and checks whether the total_error is equivalent to 0.0.
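
The "Reference Response Data Matches Published Test Data" check comes down to an exact comparison between the published response data and the stored reference csv files. The unit testing script's internals are not reproduced here; the sketch below only illustrates the idea with pandas, using hypothetical file names and assuming both files carry the same columns.

import pandas as pd

def response_data_matches(reference_csv: str, published_csv: str) -> bool:
    """Return True when the published response data is identical to the reference csv."""
    reference = pd.read_csv(reference_csv)
    published = pd.read_csv(published_csv)
    # An exact match requires the same shape, the same columns, and the same values.
    return reference.equals(published)

# Hypothetical file names for illustration only.
if response_data_matches("reference_responses/Voltage.csv", "published_responses/Voltage.csv"):
    print("Pass")
else:
    print("Fail")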

BMS Type B

If the test folder is not yet in the correct path, it should be named 10_test_Element_EMEA_Overall and placed in test_data/bms/10_test_Element_EMEA_Overall.

Note that plugin_name in the command from the "Automated Testing for Plugins" section above should still be bms, since these tests are still run using the bms plugin and the bms unit testing script.
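
For BMS type B tests, the metadata that the checks rely on (Test Name, Temperature[degC], Program) comes from the [HEADER START] section at the top of each .csv data file. The exact layout of that section is not documented here, so the sketch below is only one hedged way to pull key/value pairs out of it: it assumes the section is closed by a [HEADER END] line and that each line in between is a comma-separated key and value. Adjust the markers and expected values to the real file layout.

import csv

def read_emea_header(data_file: str) -> dict:
    """Collect key/value metadata from the [HEADER START] section of an EMEA .csv file.

    Assumes the section is bracketed by [HEADER START] and [HEADER END] lines and that
    each line in between is a comma-separated key/value pair; the real files may differ.
    """
    metadata = {}
    in_header = False
    with open(data_file, newline="") as handle:
        for row in csv.reader(handle):
            if not row:
                continue
            if row[0].strip() == "[HEADER START]":
                in_header = True
                continue
            if row[0].strip() == "[HEADER END]":
                break
            if in_header and len(row) >= 2:
                metadata[row[0].strip()] = row[1].strip()
    return metadata

# The Meta Data check then reduces to comparing a few expected keys (placeholder values).
expected = {"Test Name": "example_test", "Temperature[degC]": "10", "Program": "example_program"}
found = read_emea_header("/path/to/emea_data_file.csv")
print(all(found.get(key) == value for key, value in expected.items()))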

The Unit_Testing_Results.csv columns along with their explanations are as follows:

Option | Description
Test File Path | The file path of the test data.
Publish args | The configurations used in the plugin.
Published and Test Name | Ensures the most recently published physical test is indeed the one being tested and verifies that the test name is as expected.
Original File Exists | Verifies that the original data file is still present.
responses json File Exists | Verifies that the responses json is still present. Side note: the responses json file(s) are required for the physical test to publish.
Meta Data | Verification of meta data. Checks Test Name, Temperature[degC], and Program.
Log File Attached | Verifies that a log file is attached to the physical test.
All Responses Present | Checks that all of the response names are present as expected.
All Key Responses Present | Checks that all the key responses (typically template responses) are present as expected.
Reference Response Data Matches Published Test Data | Ensures that the raw response data for the set of reference responses matches exactly against the reference responses csv files.
Test Benchmark Comparison | Uses the -compare-tests CLI argument and checks whether the total_error is equivalent to 0.0.

10.2. CarTech Plugin Testing

The CarTech tests that are supported in the automated testing are those in the 12_test_CarTech_Overall test folder, which must be located in test_data/cartech/12_test_CarTech_Overall. This test folder contains 10 different tests, 2 of which contain multiple resume data files.

The Unit_Testing_Results.csv columns along with their explanations are as follows:

Option | Description
Test File Path | The file path of the test data.
Publish args | The configurations used in the plugin.
Published and Test Name | Ensures the most recently published physical test is indeed the one being tested and verifies that the test name is as expected.
Original File Exists | Verifies that the original data file is still present.
xy Files Uploaded | Verifies that the .xy files for the required responses Ah, Current, JobModule#, Temp, and Voltage are all present as attached files in the published physical test (see the sketch after this table).
Log File Uploaded | Verifies that a log file is attached to the published physical test.
Comparing x-values | Ensures that the raw response x-value data for the set of reference responses matches exactly against the reference responses csv files.
Comparing y-values | Ensures that the raw response y-value data for the set of reference responses matches exactly against the reference responses csv files.
Meta Data | Verification of meta data. Checks file name, misc, rxsy, date, battery_location, project, capacity, name, temp, and cell.
Test Benchmark Comparison | Uses the -compare-tests CLI argument and checks whether the total_error is equivalent to 0.0.
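
The "xy Files Uploaded" check reduces to confirming that one .xy attachment exists per required response. The sketch below illustrates that with a plain set comparison; the list of attached file names would come from the published physical test, and the <response>.xy naming pattern is an assumption made for the example.

REQUIRED_RESPONSES = ("Ah", "Current", "JobModule#", "Temp", "Voltage")

def xy_files_uploaded(attached_file_names: list[str]) -> bool:
    """Return True when every required response has a matching .xy attachment."""
    attached = set(attached_file_names)
    # Assumed naming pattern: one "<response>.xy" file per required response.
    return all(f"{response}.xy" in attached for response in REQUIRED_RESPONSES)

# Example with a hypothetical attachment list pulled from a published test.
print(xy_files_uploaded(["Ah.xy", "Current.xy", "JobModule#.xy", "Temp.xy", "Voltage.xy", "run.log"]))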

10.3. HVAC Plugin Testing

The HVAC tests that are included in the automated testing are the 6 tests denoted as "reference tests". The test folder should be named Multiple_MDFs_test_1_outlier and be located in test_data/HVAC/Multiple_MDFs_test_1_outlier. The folder is named this way because one data point was modified so that it would be detected and removed by the outlier detection algorithm.
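
The plugin's actual outlier detection algorithm is not described in this guide. Purely to illustrate why the modified data point is removed, the sketch below drops values whose modified z-score (median and MAD based) exceeds a threshold; the data and threshold are placeholders.

import statistics

def remove_outliers(values: list[float], threshold: float = 3.5) -> list[float]:
    """Drop values whose modified z-score exceeds the threshold (illustrative only)."""
    median = statistics.median(values)
    mad = statistics.median(abs(value - median) for value in values)
    if mad == 0:
        return list(values)
    return [
        value
        for value in values
        if abs(0.6745 * (value - median) / mad) <= threshold
    ]

# One deliberately modified data point (999.0) stands out and is removed.
print(remove_outliers([20.1, 20.3, 19.9, 20.0, 999.0, 20.2]))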

Each test uses a different combination of the available plugin options in order to cover a wide range of configurations. The Unit_Testing_Results.csv columns along with their explanations are as follows:

Option | Description
Test Folder Path | The folder path containing the test data.
Publish args | The configurations used in the plugin.
Unified DAT file exists | Ensures the unified DAT file is present.
responses json File Exists | Ensures the responses json file is present.
Filtered Responses | Ensures the responses present are only those specified in the Keep DAT/MDF Columns filters (aside from the time channels and the DAT Channel & MDF Channel).
Published and Test Name | Ensures the most recently published physical test is indeed the one being tested and verifies that the test name is as expected.
Published Response Names Equals JSON Response Names | Checks that all the responses (which should be in the unified dat file) are present in responses.json and the published physical test (see the sketch after this table).
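
The last check is a comparison of two sets of names. The sketch below shows the general shape of such a comparison, assuming responses.json can be reduced to a flat list of response names (its real structure is not documented here) and that the published response names are available as a list.

import json

def response_names_match(responses_json_path: str, published_names: list[str]) -> bool:
    """Compare the response names in responses.json against the published test's responses.

    Assumes the JSON payload is either a list of names or a dict keyed by name; the real
    structure of responses.json may differ, so adapt the extraction accordingly.
    """
    with open(responses_json_path) as handle:
        payload = json.load(handle)
    json_names = set(payload if isinstance(payload, list) else payload.keys())
    return json_names == set(published_names)

# Hypothetical inputs for illustration only.
print(response_names_match("responses.json", ["Temp", "Voltage", "Current"]))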