Evaluation of Fast Charging Efficiency under Extreme Temperatures

Multi-type fast charging stations are being deployed across Europe as electric vehicle adoption grows. The expansion of charging infrastructure in different countries poses different installation challenges. One of these is the extremely heterogeneous weather at the different latitudes where fast charging stations are located, whose impact on charging performance is often neglected or unknown. The present study focused on the evaluation of the electric vehicle (EV) charging process with fast charging devices (up to 50 kW) at ambient temperature (25 °C) and at extreme temperatures (−25 °C, −15 °C, +40 °C). A sample of seven fast chargers and two electric vehicles (CCS (combined charging system) and CHAdeMO (CHArge de Move)) available on the commercial market was considered in the study. Three-phase voltages and currents at the wall socket supplying the charger, as well as voltage and current at the plug connection between the charger and vehicle, have been recorded. According to SAE (Society of Automotive Engineers) J2894/1, the power conversion efficiency during the charging process has been calculated as the ratio between the instantaneous DC power delivered to the vehicle and the instantaneous AC power supplied from the grid, in order to test the performance of the charger. The inverse of the efficiency of the charging process, i.e., a kind of energy return ratio (ERR), has been calculated as the ratio between the AC energy supplied by the grid to the electric vehicle supply equipment (EVSE) and the energy delivered to the vehicle's battery. The evaluation has shown a varied scenario, confirming the efficiency values declared by the manufacturers at ambient temperature and revealing lower energy efficiencies at extreme temperatures, due to lower requested and, thus, delivered power levels.
The lowest and highest power conversion efficiencies, 39% and 93%, were observed at −25 °C and at ambient temperature (+25 °C), respectively.


Introduction
Under certain conditions, e-mobility may represent a great promise for environmental protection and future economic growth. Electrification of the transportation sector is a crucial means to achieve the 2020 and 2030 targets for reducing the economy's oil dependency and carbon emissions from transportation. However, the expected 100 million electric vehicles foreseen by 2030 in the Paris declaration on Electro-mobility and Climate Change [1] are far from the over 750,000 sales worldwide in 2016 [2]. The reasons for this still low market share of BEVs (battery electric vehicles) are, in addition to the higher costs of electric vehicles compared to conventional ones, the limited driving ranges, which are closely tied to battery condition, commonly characterized by:

• The state of health (SoH) of a battery, which indicates the specified performance and health condition of a used battery compared with a new battery of the same type.
• The state of function (SoF), an indicator of the performance of a battery during operation [6].
Fast charging technologies have mostly been studied from the point of view of their standardisation, usage, and market penetration. Many studies have also been conducted regarding the potential impact of vehicle electrification on power distribution systems.
Differences between charging systems, like nominal voltage, speed of charging, and plug type, are highlighted in [7], while in [8] standard and fast-charging methods for electric vehicles are compared. This comparison claims an increase of power losses during the fast charging process and underlines the possibility that a growing infrastructure of fast-charging stations could lead to inefficient management of electrical energy. However, it has to be considered that, during AC charging, the conversion losses occur inside the EV. From first principles, DC charging should be the more energy-efficient solution when charging EVs at rates well above 20 kW. Additionally, in terms of use intensity over lifetime, it should be more convenient to deploy the investment cost of a rectification unit in a public charger rather than in every private EV.
Authors in [9,10] foresee that a large deployment of EVs could result in a violation of supply/demand matching and statutory voltage limits, and also in power quality problems and voltage imbalance under certain operating conditions. This should be taken into account considering the future growth of the amount of energy delivered by the rolled-out EV infrastructure. For instance, chargers of the "Rapid Charge Network", one of the European Commission's co-financed projects [11], delivered around 300 MWh between July 2014 and November 2015, but such values are going to rise strongly soon. Nevertheless, studies about fast charging usage [3] and its market penetration [12] underline that the availability of an adequate public infrastructure is crucial to accelerate the electrification of transport.
The aim of this experimental activity is to provide scientific evidence and present results of real use cases of the practical aspects and issues of the fast charging process, in terms of power efficiency and dependency on extreme temperature variations. It is believed that our findings can benefit citizens', investors', and policy-makers' perceptions toward the improvement of such technologies and raise their awareness of the hidden challenges. Real experimental data about the assessment of the charging process [13], considering the whole system constituted by grid, charger, and vehicle, are not widely available, because the process requires collecting test samples (chargers and vehicles) from various manufacturers globally. In addition, the measurement equipment and the test facilities, such as climatic chambers covering a wide temperature range, are not easily available even for automotive makers. Our work considered all these challenges, and the aim was to gain insight into the energy performance of the vehicle charging systems under various environmental (temperature) conditions.

Measurements Set-Up
A test campaign on several fast chargers in combination with different vehicles has been carried out in a climatic chamber, varying the temperature from −25 °C to +40 °C.
Seven commercial fast-charging columns have been tested with two EVs in the VeLA 8 laboratory of the European Interoperability Centre for electric vehicles and smart grids at the Joint Research Centre (JRC). This is a climatic test cell designed to test light- to medium-duty battery-electric, hybrid, or fuel cell electric vehicles and chargers at temperatures in the range from −30 °C to +50 °C, at controlled humidity. The test set-up is shown in Figure 1.

According to the standard IEC (International Electrotechnical Commission) 61851-1 ed. 3.0 [14], environmental tests are foreseen for EV conductive charging systems: the DC electric vehicle charging station shall operate at its nominal voltage with maximum output and current within the temperature range −25 °C to +40 °C for outdoor units and −5 °C to +40 °C for indoor ones.
Currents and voltages have been acquired using a power analyser set with a 0.5 s time step for data acquisition; the analyser was kept outside the climatic chamber due to its operating temperature limitations.
The chargers were connected to a maximum 125 A, 400 Vac three-phase, 50 Hz wall socket, supplied by the JRC electrical grid. The JRC grid is a TN-S grounding system distributed to all the buildings at 400 Vac. The main supply of energy is at 132 kV and is transformed to 11.6 kV in the electric power station, where the parallel system is present with the production of the cogeneration plant. A significant number of medium/low-voltage cabins then transform the electricity from 11.6 kV to 400 Vac. In addition, several photovoltaic systems are installed on the roofs of buildings; this electric energy production enters directly into the energy balance of the single building. The three-phase grid section where the chargers were connected has been monitored by means of a breakout box in order to measure voltages and currents for each phase. The DC section has been monitored close to the plug connection between the charger and the vehicle. Currents have been measured by means of highly accurate clamps: a zero-flux, flux-gate type for the DC side with a broad operating temperature range (−40 °C to +85 °C, with DC amplitude accuracy of ±0.3% rdg (reading), ±0.02% full scale), and a Hall effect element type for the grid section (basic accuracy of ±0.5% rdg, ±0.05% full scale, phase within ±0.2°).
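For illustration, the quantities recorded at the two measurement sections can be combined into input and output powers as follows; the numbers and variable names are hypothetical, not measured values:

```python
# Sketch (hypothetical sample values): total AC input power as the sum of
# the three per-phase active powers reported by the power analyser, and
# DC output power from the plug-side voltage and current.

def ac_power_kw(phase_powers_w):
    """Total AC active power (kW) from the three per-phase powers (W)."""
    return sum(phase_powers_w) / 1000.0

def dc_power_kw(voltage_v, current_a):
    """DC power (kW) at the plug from measured voltage and current."""
    return voltage_v * current_a / 1000.0

p_ac = ac_power_kw([16700.0, 16650.0, 16680.0])  # W per phase (illustrative)
p_dc = dc_power_kw(385.0, 120.0)                  # 385 V, 120 A at the plug
print(round(p_ac, 2), round(p_dc, 2))             # → 50.03 46.2
```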
Following SAE J2894/1 and SAE J2894/2 [15,16], the power conversion efficiency during the charging process and the energy return ratio (ERR) of the charging process have been calculated.
The recommended practice SAE J2894/1 defines the power conversion efficiency as a measure of how efficiently the charging equipment processes power from its input terminals to its output terminals. It can be measured over the total charging cycle or at any point in the charging cycle. It is a function of the design of the charger and, therefore, it can be a representative parameter for the charger [15]. It has been calculated as the ratio between the instantaneous DC power delivered to the vehicle and the instantaneous AC power supplied from the grid in order to test the performance of the charger.
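As defined here, the power conversion efficiency is a pointwise ratio of the two logged power signals; a minimal sketch with illustrative numbers, not measured data:

```python
# Sketch: instantaneous power conversion efficiency per SAE J2894/1,
# the ratio of DC output power to AC input power. Values are illustrative.

def conversion_efficiency(p_dc_kw, p_ac_kw):
    """Instantaneous efficiency (0-1) of the charger at one sample."""
    if p_ac_kw <= 0:
        raise ValueError("AC input power must be positive")
    return p_dc_kw / p_ac_kw

# e.g., 46.5 kW delivered at the plug for 50 kW drawn from the grid:
print(f"{conversion_efficiency(46.5, 50.0):.1%}")  # → 93.0%
```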
The energy return ratio (ERR) represents the inverse of the system energy efficiency and defines how the system uses the energy that the charger delivers. Together with the power factor, it provides information about the electrical system impact of the vehicle. It has been calculated as the ratio between the AC energy supplied by the grid to the electric vehicle supply equipment (EVSE) and the energy delivered to the vehicle's battery.
As power conversion efficiency does not give information about how the system uses energy, likewise the system energy efficiency cannot be a controlled parameter for a charger, because the charger cannot control how the system uses energy. Power conversion efficiency and the energy return ratio therefore need to be taken into account for the charger and for the system's recharge process, respectively. It is beyond the scope of this work to merge the different parameters (such as ERR, power conversion efficiency, and power factor) into a single equation or factor.
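The ERR defined above can be computed from the logged 0.5 s power samples by integrating both traces over the charging window; a minimal sketch, assuming (hypothetically) the traces are available as lists of kW values:

```python
# Sketch: energy return ratio (ERR) per SAE J2894/2, from power traces
# sampled every 0.5 s as in this campaign. Values are illustrative.

def energy_kwh(power_kw, dt_s=0.5):
    """Trapezoidal integration of a power trace (kW) sampled every dt_s seconds."""
    return sum((a + b) / 2 for a, b in zip(power_kw, power_kw[1:])) * dt_s / 3600.0

p_ac = [50.0, 50.2, 49.8, 50.1]  # AC power drawn from the grid (kW)
p_dc = [46.4, 46.6, 46.3, 46.5]  # DC power delivered to the battery (kW)

# ERR = AC energy supplied to the EVSE / DC energy delivered to the battery
err = energy_kwh(p_ac) / energy_kwh(p_dc)
print(round(err, 3))  # → 1.077
```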
Fast Charging Modes and Vehicles
Seven charging columns, among those commercially available on the market, have been tested during fast-charging processes with different vehicles. All chargers were equipped with different charging options (AC and DC). The test campaign focused on DC fast charging using CCS or CHAdeMO, referred to in the international standard IEC 61851-1 as Mode 4: connection of the EV to the AC supply network using an off-board charger [14].
The specifications of the chargers under test are described in Table 1. The performance in terms of efficiency, as declared by the manufacturers, ranges between 92% and 95%. The power factor values (when stated) range from 0.96 to 0.99. The declared operating temperatures cover at least −25 °C up to +40 °C, except for charger G, which has a specification of −20 °C as the lower limit. In some chargers (A, C, D, and E) the limit of −30 °C was the result of an optional extension to the declared limit of −10 °C, while chargers B and F come by default with an option of −25 °C and −30 °C, respectively [17].
Historically, the CHAdeMO standard was introduced for the first time by Japanese automotive manufacturers Nissan and Mitsubishi in 2005 and then, in 2011, adopted in Europe, while the combined charging system (CCS) plug was initially developed in 2009 and then adopted by Audi, BMW, Daimler, Ford, General Motors, Porsche, Volvo, and Volkswagen in mid-2012, with specific plug forms for US and the rest of the global markets. Each of these standards operates at different DC voltages with different maximum power levels [7].
CHAdeMO charging has been tested with a 24 kWh electric vehicle with a laminated lithium-ion high-voltage battery of 360 V (96 cells). In this case, the state of charge (SoC) of the high-voltage battery has been recorded by means of an ECU (Engine Control Unit) logger internally developed by the JRC. The charging system of the considered CHAdeMO EV has a maximum output power of 62.5 kW (voltage up to 500 V and current up to 200 A). Table 2 shows literature data in terms of powers, currents, and voltages, and the time required for the recharge of a PHEV (Plug-in Hybrid Electric Vehicle) and a BEV with CHAdeMO. The CCS (combined charging system) vehicle used for the tests is a range extender electric vehicle (REEV) equipped with a lithium-ion battery of 33 kWh/94 Ah. The high-voltage battery of the vehicle is made of lithium-ion cells with a nominal voltage of 3.68 V connected in series. Its chemistry is characterized by a mix of nickel, manganese, and cobalt for the cathode and lithium manganese oxide for the anode. The declared cell operational temperature range is +25 °C to +40 °C, while the HV battery unit operational temperature range is −40 °C to +50 °C. The state of charge has been acquired directly from the charging column under test. CCS combines EV charging possibilities with either single-phase or three-phase AC or, alternatively, with direct current, all in a single system. Table 3 shows the most representative technical data provided by SAE in terms of powers, currents, and voltages, and the time required for the recharge of a PHEV and a BEV with CCS.

Results
Due to time constraints, not all combinations of temperatures and charging modes were tested. Furthermore, at ambient temperature the charging process has been fully monitored (SoC 12–80%), while at extreme temperatures the process has been recorded in the range 20–40% of SoC. Although this reduced range represents a limit to the technical discussion, the results have been considered representative of the charging process. Indeed, higher power is delivered during the first part of the recharge, when the battery is almost empty, while the final part of the recharge is characterized by a much lower rate with lower currents involved [13], with minimal power delivered and a great reduction of the charger efficiency [15]. In support, SAE J2894/1 [15] also recommends calculating the power conversion efficiency at the rated full power output of the charger, as the low-power conversion efficiency at full output may constitute a significant power loss. Table 4 illustrates this behavior: it shows the power conversion efficiency and the power delivered by chargers A–G during charging of the EV at different SoC at ambient temperature. The time required by the chargers to charge the battery from 12% to 80% of the state of charge and the related energy return ratio (ERR) are also reported at an ambient temperature of around 25 °C, confirming the specifications reported in Table 2. According to the recommended practice SAE J2894/2 [16], power conversion efficiency values of 90% or more and energy return ratio values of 1.10 or less are indices of "high tier" efficiency level parameters. Table 5 shows the values of power conversion efficiency and delivered power at a specific SoC at extreme temperatures (−25 °C, −15 °C, and +40 °C) for chargers A–G. Comparing Table 4 with Table 1, it can be observed that, at ambient temperature, the measured data are in line with the values declared by the manufacturers.
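The SAE J2894/2 "high tier" criterion mentioned above reduces to a simple check, with the thresholds as stated in the text:

```python
# Sketch: "high tier" classification per SAE J2894/2 as described in the
# text: efficiency of 90% or more and ERR of 1.10 or less.

def is_high_tier(conversion_efficiency, err):
    """True if a charger meets both 'high tier' thresholds."""
    return conversion_efficiency >= 0.90 and err <= 1.10

print(is_high_tier(0.93, 1.08))  # → True
print(is_high_tier(0.39, 2.50))  # → False  (e.g., the worst case at -25 °C)
```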
Furthermore, as shown in Table 5, at extreme temperatures the power conversion efficiency drops in some cases by more than half (e.g., charger A at −15 °C), while the best tolerance in terms of efficiency stability across the entire temperature range, and thus power levels, is demonstrated by charger D. Its efficiency deviates at most by 18% with respect to the measured ambient condition, while half of the chargers failed to charge at this temperature. Charger C shows better efficiency values than D at −15 °C, but it fails to work at −25 °C. In terms of delivered power, as shown in Table 5, charger F appears to show a relatively better overall performance under extreme temperature variations between +40 °C and −25 °C. On the other hand, charger D establishes the best power efficiency values across the entire test temperature range. Figures 2 and 3 show the variation of the delivered power and of the power conversion efficiency at different temperatures. These figures demonstrate that higher temperatures (+25 °C and +40 °C) do not influence normal charging operations; however, extremely low temperatures (−15 °C and −25 °C) lead to a strong reduction of the delivered power and, consequently, of the power conversion efficiency. The lower the temperature, the heavier the impact on the process.
However, some cases (chargers A, C, G) of an "Out of Order" state of the charger were due to temperature-induced functionality problems. This behavior is also laid down in the standard IEC 61851-23 [18], which states that the DC electric vehicle charging station shall be able to deliver DC power within the limits of its maximum rated power at ambient temperature, whereas it is allowed to de-rate the power or the current outside the operating range. In any case, the standard foresees that national or industrial codes and regulations may require different operating temperature ranges.

Discussion
Once the charger is connected to the grid, current peaks occur and there is a transient interval during which currents are higher due to initialization processes. To avoid misleading readings, measurements were performed half an hour after powering on the charging columns, so that the charger could be considered ready for measurement.
Whenever a malfunctioning event occurred, an error message was displayed on the screen of the charger. In some cases the error was due to an interoperability issue, while in other cases (chargers A, C, G) the issue could be related to the extremely low temperature, which puts significant stress on the reliability and quality of the electronic components due, e.g., to their temperature coefficients. As these chargers were in a prototype and/or evaluation state, the manufacturers could not supply our laboratory with further information about the internal structure and details of their designs. Had the issue been due to a malfunction of the electric vehicle itself (e.g., battery state, on-board control module, etc.) under extreme temperatures, the error state would have been observed across all measurement cycles, which, however, was not the case.
The location of the vehicle during the tests at extreme temperature conditions was a subject of discussion. From the charger manufacturers' point of view, the charger should have been characterized excluding the possible influence of the connected vehicle, in order to fully explore its capabilities. Therefore, the initially proposed approach was to keep the vehicle outside the climatic chamber. However, from the customer's and investor's point of view, it is crucial to know the outcome of a real-case scenario.
Since the aim of the test campaign was to provide scientific evidence for these innovative scenarios, the vehicle has been tested inside the climatic chamber, together with the charging column. Hence, the measurements have also been affected by the behavior of the load under charge.
Efficiency values at ambient temperature are around 90% and do not exceed 92% (Figure 3), slightly improving on the fast charging efficiency of 89% reported in [8], which is lower than the standard AC charging efficiency, evaluated at around 95%. This is mainly due to energy losses with a quadratic dependence on the charging current, especially losses in the battery and chokes.

Concerning the recharge of vehicles at extreme temperatures, at +40 °C the efficiency results have been confirmed, with a slight reduction in the delivered power and, in certain cases, a higher value of efficiency. In fact, lithium-ion chemistry is very susceptible to temperature variations.
At higher temperatures (>40 °C), the usable cell capacity significantly increases, as shown in Figure 4, while the internal resistance of the Li-ion cells (Figure 5) decreases further. However, overheating under stressful conditions, such as high ambient temperature, can shorten the lifespan of the battery due to increased degradation of the battery cells [4].
Additionally, at negative temperatures the internal resistance of the battery increases, so that the charging capacity is reduced. Low environmental temperatures, coupled with a high charging rate (the ratio between the charging current and the nominal capacity of the battery in Ah), lead to a quick degradation of the state of function (SoF) of batteries, as discussed in [6] and shown in Figure 6.
Lithium-ion accumulators have different cell chemistries, materials, and additives. These factors influence the behavior of the cell beyond its specification limits. The specification limits for charge and temperature, for example, differ for various cell types and cell chemistries and depend on the cell production process. Individual cells within the battery may be connected in series and/or in parallel. Each manufacturer follows an individual and specific design methodology which is not disclosed to external stakeholders.
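The charging rate referred to above (often called the C-rate) is simply the charging current divided by the nominal capacity in Ah; a small sketch using the 94 Ah capacity of the CCS test vehicle and a hypothetical current:

```python
# Sketch: charging rate (C-rate), the ratio between the charging current
# and the battery's nominal capacity in Ah. The 94 Ah value is the CCS
# test vehicle's capacity; the 120 A current is a hypothetical example.

def c_rate(current_a, capacity_ah):
    """Charging rate in C, e.g. 1C fully charges a nominal pack in one hour."""
    return current_a / capacity_ah

print(round(c_rate(120, 94), 2))  # → 1.28
```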
In the event that the cell operates outside the specified limits (in terms of temperature, voltage, and current), the reaction within the battery can quickly become uncontrolled and exothermic, leading to thermal runaway and, in unexpected cases, also to fire and explosion. Thermal runaway is an irreversible process in which more heat is released than can be dissipated from the cell housing; it is, therefore, the main safety hazard in lithium-ion batteries [19]. For this reason, each vehicle is equipped with a battery management system (BMS), an analogue and/or digital electronic device whose main task is to increase the safety and reliability of the battery system, improving battery energy usage efficiency (i.e., increased driving range) and battery lifetime. The BMSs of vehicles are configured with specific thresholds to limit charging rates and to prevent over-charge/discharge episodes, and to actively manage the temperature of the pack, e.g., by controlling a heater to keep it above its minimum operating temperature, or an air or liquid cooling system to keep it below its maximum operating temperature.
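As an illustration only (real BMS strategies are proprietary and not disclosed, as noted above, and depend on per-cell voltages, SoC, and SoH), a temperature-dependent charge-current limit of the kind a BMS applies might be sketched as:

```python
# Illustrative sketch of a hypothetical BMS charge-current de-rating curve.
# All breakpoints and slopes are invented for illustration; they are NOT
# the strategy of any vehicle tested in this study.

def charge_current_limit(temp_c, max_current_a=200.0):
    """Maximum allowed charging current (A) at a given pack temperature."""
    if temp_c < -25 or temp_c > 50:
        return 0.0                                        # refuse to charge
    if temp_c < 0:
        # de-rate linearly below 0 °C, down to 10% at -25 °C
        return max_current_a * (0.1 + 0.9 * (temp_c + 25) / 25)
    if temp_c > 40:
        # de-rate above 40 °C to limit thermal degradation
        return max_current_a * (1 - 0.5 * (temp_c - 40) / 10)
    return max_current_a                                  # full current

print(charge_current_limit(25))    # → 200.0 (full current at ambient)
print(charge_current_limit(-15))   # → 92.0  (strongly de-rated)
```

A curve of this shape would reproduce the qualitative behavior observed in the tests: full power around ambient, progressively reduced power below 0 °C, and no charge at all outside the operating window.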
The CCS vehicle is equipped with a modular BMS consisting of master and slave boards, while the CHAdeMO vehicle has a centralized topology, which leads to more challenges in terms of battery insulation compared to several subsystems with lower voltage levels.
At low temperatures (−15 °C/−25 °C), the consistent reduction in the delivered power and, consequently, in the power conversion efficiency of the charging process occurs with both the CHAdeMO and CCS vehicles, regardless of the BMS topology and of the high-voltage batteries' specific chemistry. Hence, the reason for this reduction has to be sought in the specific strategies of the battery management systems (BMS) for holding the charging process at extreme conditions, rather than only in the functionality of the charging column.
The EVSE, in fact, has to intrinsically follow such a BMS strategy, but it is interesting that fast chargers differently show strong impacts of de-rated power onto their efficiency. This becomes Lithium-ion accumulators have different cell chemistries, materials, and additives. These factors influence the behavior of the cell beyond its specification limits. The specification limits for charge and temperature, for example, differ for various cell types and cell chemistries and depend on the cell production process. Individual cells within the battery could be connected in series and/or in parallel. Each manufacturer follows an individual and specific design methodology which is not disclosed to external stakeholders.
In the event the cell operates outside the specified limits (in terms of temperature, voltage, and current), the reaction within the battery can quickly become uncontrolled and exothermic, leading to thermal runaway and, in the worst cases, also to fire and explosion. Thermal runaway is an irreversible process in which more heat is released than can be dissipated from the cell housing; it is, therefore, the main safety hazard in lithium-ion batteries [19]. For this reason, each vehicle is equipped with a battery management system (BMS), an analogue and/or digital electronic device whose main task is to increase the safety and reliability of the battery system, improving battery energy usage efficiency (i.e., increasing the driving range) and battery lifetime. Vehicle BMSs are configured with specific thresholds to limit charging rates and to prevent overcharge/overdischarge episodes, and they actively manage the temperature of the pack, e.g., by controlling a heater to keep it above its minimum operating temperature, or an air or liquid cooling system to keep it below its maximum operating temperature.
The CCS vehicle is equipped with a modular BMS consisting of master and slave boards, while the CHAdeMO vehicle has a centralized topology, which poses more challenges in terms of battery insulation than several subsystems with lower voltage levels.
At low temperatures (−15 °C/−25 °C), a consistent reduction of the delivered power and, consequently, of the power conversion efficiency of the charging process occurs with both the CHAdeMO and CCS vehicles, regardless of the BMS topology and of the specific chemistry of the high-voltage batteries. Hence, the reason for this reduction has to be attributed to the specific strategies adopted by the battery management system (BMS) to sustain the charging process under extreme conditions, rather than only to the functionality of the charging column.
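As a purely illustrative sketch of such a BMS strategy, the temperature-based derating of the requested charge current can be expressed as a piecewise map. The thresholds and percentages below are hypothetical, since the actual derating maps are manufacturer-specific and not disclosed:

```python
def derated_charge_current(temp_c: float, max_current_a: float = 125.0) -> float:
    """Illustrative BMS-style derating of the requested charge current
    as a function of pack temperature. All breakpoints and scaling
    factors are hypothetical, not taken from any measured vehicle."""
    if temp_c < -25.0 or temp_c > 55.0:
        return 0.0  # outside the operating window: charging refused
    if temp_c < 0.0:
        # linear ramp from 20% of the maximum at -25 °C to 100% at 0 °C
        return max_current_a * (0.2 + 0.8 * (temp_c + 25.0) / 25.0)
    if temp_c > 40.0:
        # taper from 100% at 40 °C down to 30% at 55 °C
        return max_current_a * (1.0 - 0.7 * (temp_c - 40.0) / 15.0)
    return max_current_a
```

A map of this kind explains why the EVSE delivers far less power at −25 °C even when the charging column itself is fully functional: the vehicle simply requests less current.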
The EVSE, in fact, has to intrinsically follow such a BMS strategy, but it is interesting that the fast chargers show markedly different impacts of the de-rated power on their efficiency. This becomes particularly clear when plotting the efficiency against the typical charging power applied by the BMS at each of the temperature levels (Figures 3 and 7).
Some chargers (notably charger D) deal very well with the derating of the car's request and, apparently thanks to a modular approach to power conversion, still achieve comparatively high rectification and conversion efficiencies.
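The efficiency and ERR figures discussed above follow directly from the SAE J2894/1 definitions adopted in this study. The sketch below restates them, together with a toy charger loss model whose parameters (`p_fixed_w`, `k_ohmic`) are hypothetical and not fitted to any of the tested chargers; it only illustrates why a fixed conversion overhead makes the efficiency collapse at BMS-derated power levels:

```python
def conversion_efficiency(p_dc_w: float, p_ac_w: float) -> float:
    """SAE J2894/1: instantaneous DC power delivered to the vehicle
    divided by the instantaneous AC power supplied from the grid."""
    return p_dc_w / p_ac_w


def energy_return_ratio(e_ac_wh: float, e_dc_wh: float) -> float:
    """ERR: AC energy supplied by the grid to the EVSE over the DC
    energy delivered to the battery, i.e., the inverse efficiency."""
    return e_ac_wh / e_dc_wh


def efficiency_from_loss_model(p_dc_w: float,
                               p_fixed_w: float = 1500.0,
                               k_ohmic: float = 2.0e-6) -> float:
    """Toy loss model (hypothetical parameters): a fixed overhead plus
    ohmic losses growing with the square of the delivered power. At
    low, BMS-derated power the fixed overhead dominates and the
    efficiency drops sharply, mirroring the trend seen at -25 °C."""
    p_ac_w = p_dc_w + p_fixed_w + k_ohmic * p_dc_w ** 2
    return p_dc_w / p_ac_w
```

Under this toy model, a charger delivering 45 kW operates near its rated efficiency, while the same charger delivering about 1 kW (a heavily derated cold-weather request) spends most of the drawn AC power on its own overhead.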

Conclusions
This study presents the results of the experimental activities regarding EV fast charging under extreme temperature conditions. Performance has been evaluated considering the entire charging process rather than the single components involved (e.g., the charger or the vehicle).
Efficiency values of fast charging columns declared by manufacturers have been confirmed during the test campaign at room temperature (25 °C). However, the results presented in this paper have demonstrated that extremely low temperatures strongly impact the power level, the duration of the charging process and, consequently, the efficiency of the various chargers.
The impact of extreme temperatures on transport electrification and its interoperability has to be taken into account not only in terms of the electric driving range of vehicles, but also from the point of view of the related infrastructure. This is particularly important as Europe has different climates, ranging from −45 °C up to 50 °C depending on latitude and season.
Technical managers should consider the negative consequences for the charging infrastructure providers' business concept when recharging at low temperatures happens at much lower power rates. In addition, business developers should take into account possibly higher harmonic distortion per power unit, longer recharge durations, and possible fault events.
All stakeholders need to be precisely informed about the performance, duration and, consequently, cost of fast charging at every possible condition. This will help to lower risk perception and to properly dimension the infrastructure. Manufacturers' specifications should report efficiency at several temperature levels, indicating the conditions at which the measurements were performed.
Future work is encouraged to extend the current investigations into the performance of EVSEs and EVs under broader environmental conditions, larger vehicle samples, and higher power ratings (e.g., high power chargers).
