# Towards Modified Entropy Mutual Information Feature Selection to Forecast Medium-Term Load Using a Deep Learning Model in Smart Homes


## Abstract


## 1. Introduction

- We propose an entropy MI-based FS method that can handle both linear and nonlinear electricity load data; specifically, we extend the work of [21] to remove irrelevant and redundant features. We also demonstrate how the modified entropy MI is applied systematically to load time series.
- We propose auxiliary variables for the FS method based on four joint discrete random variables. Furthermore, the efficiency of the selected candidate features is examined based on their ranking.
- An accurate and robust MTLF (AR-MTLF) model based on a conditional restricted Boltzmann machine (CRBM) is proposed to forecast month-ahead hourly electrical loads. "Robust" here refers to the efficiency of the proposed model in terms of execution time and the dynamic analysis of consumers' behaviors. In addition, a Jaya-based meta-heuristic optimization algorithm is used to improve the forecasting accuracy.
- We analyze consumers' energy consumption behaviors by adopting a discrete time Markov chain (DTMC) that determines the state-dependent features. Furthermore, adaptive k-means [22] is used to classify the electricity load into five groups (e.g., low, average, high, and extremely high consumption). It also derives the number of transitions, and the obtained values serve as the quantization levels of the load consumption.
- The proposed model was implemented on the GEFCom2012 US utility dataset [23]. In addition, we compared the AR-MTLF model with the accurate fast-converging STLF (AFC-STLF) [21], artificial neural network (ANN), naive Bayes (NB), k-nearest neighbor (KNN), support vector regression (SVR), and ensemble models.
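The MI criterion at the core of the proposed FS method scores a candidate feature by the information it shares with the target load. A minimal sketch of plug-in MI estimation for discrete (binarized) features follows; the toy samples are illustrative and are not drawn from GEFCom2012:

```python
from collections import Counter
from math import log2

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples."""
    n = len(x)
    px, py = Counter(x), Counter(y)
    pxy = Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        p_ab = c / n
        mi += p_ab * log2(p_ab / ((px[a] / n) * (py[b] / n)))
    return mi

# A feature identical to a uniform binary target carries the maximal 1 bit:
target  = [0, 1, 0, 1, 0, 1, 0, 1]
feature = [0, 1, 0, 1, 0, 1, 0, 1]
print(mutual_information(feature, target))  # 1.0
```

A feature independent of the target scores 0 bits, which is the basis for discarding irrelevant candidates.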

## 2. Related Work

## 3. Problem Statement

## 4. System Model

#### 4.1. Data Preparation and Preprocessing

#### 4.2. Modified MI Based FS

#### 4.3. Forecaster Module

#### 4.4. Optimizer Module

#### 4.5. Customers’ Dynamic Behavior

Algorithm 1 Adaptive k-means.

```text
 1: procedure ADAPTIVE-K-MEANS(actual dataset)
 2:   i ← 0; j ← 0                              ▹ initialize iteration counters
 3:   data ← double(actual dataset)
 4:   index ← data(:)                           ▹ copy values as an array
 5:   while true do
 6:     M1 ← mean(data)                         ▹ initialize the mean point
 7:     i ← i + 1                               ▹ increment counter on each iteration
 8:     while true do
 9:       j ← j + 1                             ▹ increment counter on each iteration
10:       ds ← (index − M1)²                    ▹ distance between index and data
11:       ds1 ← sum((index − M1)²) / N          ▹ N is the size of the dataset
12:       best ← ds − ds1                       ▹ check whether it is selected accurately
13:       M1_new ← mean(M1(best))               ▹ update the mean point
14:       if M1_new then
15:         break
16:       end if
17:       if M1 == M1_new | j > β then          ▹ β is the distortion threshold
18:         j ← 0
19:         index(best) ← [ ]                   ▹ remove values already assigned to a cluster
20:         Center(i) ← M1_new                  ▹ store the cluster center
21:         break
22:       end if
23:       M1 ← M1_new                           ▹ update the mean point
24:     end while
25:     if index == 0 | i > β then              ▹ check the maximum number of clusters
26:       i ← 0
27:       break
28:     end if
29:   end while
30:   Center ← sort(Center)                     ▹ sort the centers
31:   Center_new ← diff(Center)                 ▹ differences between adjacent centers
32:   Center(Center_new <= intercluster) ← [ ]  ▹ drop centers closer than the inter-cluster distance
33:   distance ← data − Center                  ▹ distance between clusters and data
34:   [~, indx] ← min(distance)                 ▹ choose the cluster index of minimum distance
35:   return indx, Center
36: end procedure
```
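Algorithm 1 is written in MATLAB-style pseudocode. As a much-simplified, runnable stand-in for the idea it implements (no preset k; the number of clusters adapts via an inter-cluster distance threshold), here is a 1-D gap-based variant. The function name and the gap rule are illustrative simplifications, not the authors' implementation:

```python
def adaptive_clusters_1d(values, intercluster):
    """Cluster 1-D values without a preset k: scan the sorted values and
    start a new cluster whenever the gap to the previous value exceeds
    `intercluster`. Returns (labels in input order, cluster centers).
    Assumes `values` is non-empty."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    labels = [0] * len(values)
    clusters = [[values[order[0]]]]
    labels[order[0]] = 0
    for i in order[1:]:
        if values[i] - clusters[-1][-1] > intercluster:
            clusters.append([])          # gap too large: open a new cluster
        clusters[-1].append(values[i])
        labels[i] = len(clusters) - 1
    centers = [sum(c) / len(c) for c in clusters]
    return labels, centers

# hypothetical hourly load readings split into a low and a high group
labels, centers = adaptive_clusters_1d([310, 95, 300, 100, 305], intercluster=50)
print(centers)  # [97.5, 305.0]
```

In the paper's setting, the resulting centers would play the role of the quantization levels of the consumption states used by the DTMC in Section 4.5.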

- Irreducible: an MC is irreducible if every state is reachable from every other state, i.e., all states communicate.
- Periodicity: the period of a state x is the greatest common divisor of its possible return times,$$\begin{array}{c}\hfill d=gcd\{n>0:pr({Y}_{n}=x\phantom{\rule{0.277778em}{0ex}}|\phantom{\rule{0.277778em}{0ex}}{Y}_{0}=x)>0\},\end{array}$$and x is aperiodic if $d=1$.
- Transient: a state x is transient if, starting from x, there is a nonzero probability of never returning to x.
- Recurrent: a state is recurrent if, starting from it, return is certain; equivalently, the expected number of visits to the state is infinite.
- Absorbing: once entered, an absorbing state cannot be left. Thus, a state x is absorbing if $p{r}_{x,x}=1$ and $p{r}_{x,y}=0$ for all $y\ne x$.
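Once the load is quantized into states, the DTMC transition probabilities are estimated by counting observed transitions, and properties such as absorption can be checked directly. A minimal sketch; the state sequence and helper names are illustrative, not from the paper's dataset:

```python
def transition_matrix(states, n_states):
    """Maximum-likelihood DTMC estimate: count observed transitions between
    quantized load states (0..n_states-1) and normalize each row."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    probs = []
    for row in counts:
        total = sum(row)
        probs.append([c / total if total else 0.0 for c in row])
    return probs

def is_absorbing(P, x):
    """A state x is absorbing iff pr_{x,x} = 1 (hence pr_{x,y} = 0 for y != x)."""
    return P[x][x] == 1.0

# hypothetical quantized levels: 0 = low, 1 = average, 2 = high
seq = [0, 0, 1, 2, 1, 0, 1, 1, 2, 2]
P = transition_matrix(seq, 3)
```

Each row of the estimated matrix sums to one, so it can be used directly as the state-dependent feature described above.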

## 5. Simulations and Discussion

#### 5.1. Hourly Load Forecasting

#### 5.2. Load Forecasting Based on Seasons

#### 5.3. Proposed Model’s Performance Evaluations

#### 5.4. Operational Forecast

#### 5.5. Consumers Consumption Dynamic

## 6. Conclusions

## Author Contributions

## Acknowledgments

## Conflicts of Interest

## Abbreviations

Abbreviation | Definition
---|---
ANN | Artificial neural network
AFC-STLF | Accurate fast converging short-term load forecasting
AG | Antigen
AIS | Artificial immune system
AR-MTLF | Accurate and robust medium-term load forecasting
CRBM | Conditional restricted Boltzmann machine
DFS | Date-framework strategy
DTMC | Discrete time Markov chain
DWT-IR | Discrete wavelet transform and inconsistency rate
EMD | Empirical mode decomposition
ELM | Extreme learning machine
FCRBM | Factored CRBM
FPR | Fuzzy polynomial regression
FS | Feature selection
GA | Genetic algorithm
GABICS | GA binary improved cuckoo search
IENN | Improved Elman neural network
KNN | K-nearest neighbors
LR | Linear regression
LSSVM | Least squares support vector machine
LSTM | Long short-term memory
LTLF | Long-term load forecasting
mEDE | Modified enhanced differential evolution algorithm
MI | Mutual information
MRMRPC | Maximum relevancy and minimum redundancy on Pearson's correlation
MRMRMS | Maximum relevancy, minimum redundancy, and maximum synergy of candidate features
MTLF | Medium-term load forecasting
MTR | Model tree rules
NB | Naive Bayes
OS-ELM | Online sequential ELM
PRNN | Pyramid system and recurrent neural networks
RMSE | Root mean square error
SWA | Sperm whale algorithm
SVR | Support vector regression
STLF | Short-term load forecasting
STPF | Short-term price forecasting
SWEMD | Sliding window EMD
VSTLF | Very short-term load forecasting
WLSSVM | Wavelet least squares support vector machine

## References

- Ahmad, T.; Chen, H. Short and medium-term forecasting of cooling and heating load demand in building environment with data-mining based approaches. Energy Build. **2018**, 166, 460–476.
- Han, L.; Peng, Y.; Li, Y.; Yong, B.; Zhou, Q.; Shu, L. Enhanced Deep Networks for Short-Term and Medium-Term Load Forecasting. IEEE Access **2019**, 7, 4045–4055.
- Ahmad, T.; Chen, H. Potential of three variant machine-learning models for forecasting district level medium-term and long-term energy demand in smart grid environment. Energy **2018**, 160, 1008–1020.
- De Oliveira, E.M.; Oliveira, F.L.C. Forecasting mid-long term electric energy consumption through bagging ARIMA and exponential smoothing methods. Energy **2018**, 144, 776–788.
- Abu-Shikhah, N.; Elkarmi, F.; Aloquili, O.M. Medium-term electric load forecasting using multivariable linear and non-linear regression. Smart Grid Renew. Energy **2011**, 2, 126–135.
- Amjady, N.; Keynia, F. Mid-term load forecasting of power systems by a new forecasting method. Energy Convers. Manag. **2008**, 49, 2678–2687.
- Alamaniotis, M.; Bargiotas, D.; Tsoukalas, L.H. Towards smart energy systems: Application of kernel machine regression for medium term electricity load forecasting. SpringerPlus **2016**, 5, 58.
- Hu, Z.; Bao, Y.; Xiong, T.; Chiong, R. Hybrid filter-wrapper feature selection for short-term load forecasting. Eng. Appl. Artif. Intell. **2015**, 40, 17–27.
- Wi, Y.-M.; Joo, S.-K.; Song, K.-B. Holiday load forecasting using fuzzy polynomial regression with weather feature selection and adjustment. IEEE Trans. Power Syst. **2012**, 27, 596.
- Jiang, P.; Liu, F.; Song, Y. A hybrid forecasting model based on date-framework strategy and improved feature selection technology for short-term load forecasting. Energy **2017**, 119, 694–709.
- Dudek, G. Artificial immune system with local feature selection for short-term load forecasting. IEEE Trans. Evol. Comput. **2017**, 21, 116–130.
- Liu, J.; Li, C. The short-term power load forecasting based on sperm whale algorithm and wavelet least square support vector machine with DWT-IR for feature selection. Sustainability **2017**, 9, 1188.
- Dong, Y.; Wang, J.; Wang, C.; Guo, Z. Research and Application of Hybrid Forecasting Model Based on an Optimal Feature Selection System—A Case Study on Electrical Load Forecasting. Energies **2017**, 10, 490.
- Liu, Y.; Wang, W.; Ghadimi, N. Electricity load forecasting by an improved forecast engine for building level consumers. Energy **2017**, 139, 18–30.
- Yang, L.; Yang, H.; Yang, H.; Liu, H. GMDH-Based Semi-Supervised Feature Selection for Electricity Load Classification Forecasting. Sustainability **2018**, 10, 217.
- Bouktif, S.; Fiaz, A.; Ouni, A.; Serhani, M. Optimal Deep Learning LSTM Model for Electric Load Forecasting using Feature Selection and Genetic Algorithm: Comparison with Machine Learning Approaches. Energies **2018**, 11, 1636.
- Koprinska, I.; Rana, M.; Agelidis, V.G. Correlation and instance based feature selection for electricity load forecasting. Knowl.-Based Syst. **2015**, 82, 29–40.
- Fallah, S.N.; Deo, R.C.; Shojafar, M.; Conti, M.; Shamshirband, S. Computational Intelligence Approaches for Energy Load Forecasting in Smart Energy Management Grids: State of the Art, Future Challenges, and Research Directions. Energies **2018**, 11, 596.
- Li, Y.; Guo, P.; Li, X. Short-term load forecasting based on the analysis of user electricity behavior. Algorithms **2016**, 9, 80.
- Omaji, S.; Javaid, N.; Asma, R. A new entropy-based feature selection method for load forecasting in smart homes. In Proceedings of the International Conference on Cyber Security and Computer Science (ICONCS), Karabük, Turkey, 18–20 October 2018; pp. 185–192.
- Ahmad, A.; Javaid, N.; Guizani, M.; Alrajeh, N.; Khan, Z.A. An accurate and fast converging short-term load forecasting model for industrial applications in a smart grid. IEEE Trans. Ind. Inform. **2017**, 13, 2587–2596.
- Al-Jarrah, O.Y.; Al-Hammadi, Y.; Yoo, P.D.; Muhaidat, S. Multi-layered clustering for power consumption profiling in smart grids. IEEE Access **2017**, 5, 18459–18468.
- Wang, P.; Liu, B.; Hong, T. Electric load forecasting with recency effect: A big data approach. Int. J. Forecast. **2016**, 32, 585–597.
- Abedinia, O.; Amjady, N.; Zareipour, H. A new feature selection technique for load and price forecast of electrical power systems. IEEE Trans. Power Syst. **2017**, 32, 62–74.
- Johannesen, N.J.; Mohan, K.; Morten, G. Relative evaluation of regression tools for urban area electrical energy demand forecasting. J. Clean. Prod. **2019**, 218, 555–564.
- Amara, F.; Agbossou, K.; Dubé, Y.; Kelouwani, S.; Cardenas, A.; Hosseini, S.S. A residual load modeling approach for household short-term load forecasting application. Energy Build. **2019**, 187, 132–143.
- Kuo, P.; Huang, C. A high precision artificial neural networks model for short-term energy load forecasting. Energies **2018**, 11, 213.
- Gaillard, P.; Goude, Y.; Nedellec, R. Additive models and robust aggregation for GEFCom2014 probabilistic electric load and electricity price forecasting. Int. J. Forecast. **2016**, 32, 1038–1050.
- Mocanu, E.; Nguyen, P.H.; Gibescu, M.; Kling, W.L. Deep learning for estimating building energy consumption. Sustain. Energy Grids Netw. **2016**, 6, 91–99.
- Quilumba, F.L.; Lee, W.; Huang, H.; Wang, D.Y.; Szabados, R.L. Using smart meter data to improve the accuracy of intraday load forecasting considering customer behavior similarities. IEEE Trans. Smart Grid **2015**, 6, 911–918.
- Singh, S.; Yassine, A. Big data mining of energy time series for behavioral analytics and energy consumption forecasting. Energies **2018**, 11, 452.
- Hsiao, Y.-H. Household Electricity Demand Forecast Based on Context Information and User Daily Schedule Analysis From Meter Data. IEEE Trans. Ind. Inform. **2015**, 11, 33–43.
- Zhang, P.; Wu, X.; Wang, X.; Bi, S. Short-term load forecasting based on big data technologies. CSEE J. Power Energy Syst. **2015**, 1, 59–67.
- Liu, N.; Tang, Q.; Zhang, J.; Fan, W.; Liu, J. A hybrid forecasting model with parameter optimization for short-term load forecasting of micro-grids. Appl. Energy **2014**, 129, 336–345.
- Amjady, N.; Keynia, F. Day-ahead price forecasting of electricity markets by mutual information technique and cascaded neuro-evolutionary algorithm. IEEE Trans. Power Syst. **2009**, 24, 306–318.
- Amjady, N.; Keynia, F.; Zareipour, H. Short-term load forecast of microgrids by a new bilevel forecasting strategy. IEEE Trans. Smart Grid **2010**, 1, 286–294.
- Chandrashekar, G.; Sahin, F. A survey on feature selection methods. Comput. Electr. Eng. **2014**, 40, 16–28.
- Rao, R. Jaya: A simple and new optimization algorithm for solving constrained and unconstrained optimization problems. Int. J. Ind. Eng. Comput. **2016**, 7, 19–34.
- Samuel, O.; Javaid, N.; Ashraf, M.; Ishmanov, F.; Afzal, M.; Khan, Z. Jaya based Optimization Method with High Dispatchable Distributed Generation for Residential Microgrid. Energies **2018**, 11, 1513.
- Wang, Y.; Chen, Q.; Kang, C.; Xia, Q. Clustering of electricity consumption behavior dynamics toward big data applications. IEEE Trans. Smart Grid **2016**, 7, 2437–2447.
- Adaptive Kmeans Clustering for Color and Gray Image. Available online: https://www.mathworks.com/matlabcentral/fileexchange/45057-adaptive-kmeans-clustering-for-color-and-gray-image (accessed on 10 October 2019).
- Sigauke, C.; Chikobvu, D. Estimation of extreme inter-day changes to peak electricity demand using Markov chain analysis: A comparative analysis with extreme value theory. J. Energy S. Afr. **2017**, 28, 68–76.
- Jaskowiak, P.A.; Campello, R.J.G.B.; Costa, I.G. On the selection of appropriate distances for gene expression data clustering. BMC Bioinform. **2014**, 15, S2.
- Mercioni, M.A.; Holban, S. A Survey of Distance Metrics in Clustering Data Mining Techniques. In Proceedings of the 2019 3rd International Conference on Graphics and Signal Processing, Hong Kong, China, 1–3 June 2019; pp. 44–47.
- Bora, D.; Jyoti, G.; Anil, K. Effect of different distance measures on the performance of K-means algorithm: An experimental study in Matlab. arXiv **2014**, arXiv:1405.7471.
- Luo, J.; Hong, T.; Fang, S. Benchmarking robustness of load forecasting models under data integrity attacks. Int. J. Forecast. **2018**, 34, 89–104.
- Luo, J.; Hong, T.; Fang, S. Robust regression models for load forecasting. IEEE Trans. Smart Grid **2018**, 10, 5397–5404.
- Khalid, A.; Javaid, N.; Mateen, A.; Ilahi, M.; Saba, T.; Rehman, A. Enhanced time-of-use electricity price rate using game theory. Electronics **2019**, 8, 48.

**Figure 12.** Comparison of forecasting accuracy with existing models for the test data of 2007 (${Z}_{1}$).

**Figure 15.** Heat map reporting the RMSE values. White indicates an RMSE range of 0–40, light green 41–200, and dark green 201–500.

**Figure 16.** Seven days of consecutive MTLF distribution of load from January to December 2007 using the actual temperature values. The gray band indicates the 95% confidence interval (CI).

**Figure 17.** Thirty days of consecutive MTLF distribution of load from January to December 2007 using the average distribution of electrical load and temperature values. The gray band indicates the 95% confidence interval (CI).

S/N | Size of Dataset (years) | Time Resolution | FS | Techniques | Objective(s) | References
---|---|---|---|---|---|---
1 | 4.5 | 1 h | ✓ | SVR | STLF | [8]
2 | 3 | 2 h | ✓ | FPR | STLF | [9]
3 | 1 | 3 min | ✓ | GABICS-DFS-ELM | STLF | [10]
4 | 2 | 1 h | ✓ | AIS | STLF | [11]
5 | 1 | 1 h | ✓ | WLSSVM-SWA | STLF | [12]
6 | 1 | 30 min | ✓ | PRNN | STLF | [13]
7 | 1 | 1 h | ✓ | IENN | STLF | [14]
8 | 2 | 30 min | ✓ | LSTM-RNN | STLF | [16]
9 | 2 | 5 min | ✓ | NN, LR and MTR | VSTLF | [17]
10 | 4 | 24 h | ✓ | MRMRMS | STLF and STPF | [24]
11 | 4 | 15 min | - | CRBM and FCRBM | STLF | [29]
12 | 1.5 | 30 min | - | k-means | LTLF | [30]
13 | 4 | 6 h | - | Clustering technique and Bayesian network | STLF + LTLF | [31]
14 | 1 | 30 min | - | OS-ELM | STLF | [19]
15 | 2 | 15 min | ✓ | Inter-cluster technique | VSTLF | [32]
16 | 1 | 1 h | - | Cluster analysis, association analysis and decision tree | STLF | [33]
17 | 4.5 | 1 h | ✓ | CRBM, adaptive k-means and DTMC | MTLF | Proposed scheme

Parameter | Value
---|---
Population size (Jaya) | 24
Number of decision variables (Jaya) | 2
Maximum iterations | 100
Maximum bound (Jaya) | 0.9
Minimum bound (Jaya) | 0.1
Number of hidden layers (CRBM) | 10
Learning rate (CRBM) | 0.001
Weight decay (CRBM) | 0.0002
Momentum (CRBM) | 0.5
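With the Jaya settings above (population 24, two decision variables in [0.1, 0.9], 100 iterations), the update rule of [38] moves each candidate toward the current best solution and away from the worst, with no algorithm-specific tuning parameters. A sketch on a toy objective; the objective function is illustrative, whereas the paper minimizes the forecast error:

```python
import random

def jaya(objective, n_vars=2, pop_size=24, lo=0.1, hi=0.9, iters=100, seed=0):
    """Jaya optimization [38]: x' = x + r1*(best - |x|) - r2*(worst - |x|),
    clamped to the bounds, with greedy replacement of non-improving moves."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(iters):
        scores = [objective(x) for x in pop]
        best = pop[min(range(pop_size), key=scores.__getitem__)]
        worst = pop[max(range(pop_size), key=scores.__getitem__)]
        new_pop = []
        for x in pop:
            cand = [
                min(hi, max(lo, xi
                            + rng.random() * (best[i] - abs(xi))
                            - rng.random() * (worst[i] - abs(xi))))
                for i, xi in enumerate(x)
            ]
            # greedy selection: keep the candidate only if it improves
            new_pop.append(cand if objective(cand) < objective(x) else x)
        pop = new_pop
    return min(pop, key=objective)

# toy problem: minimize the squared distance to (0.3, 0.7) within the bounds
sol = jaya(lambda x: (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2)
```

The absence of tunable control parameters (unlike, e.g., GA crossover/mutation rates) is the usual motivation for pairing Jaya with a forecaster.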

$\mathit{x}/\mathit{i}$ | 1 | 2 | 3 | 4 | 5
---|---|---|---|---|---
1 | 0.1458 | 0.1458 | 0.2500 | 0.0833 | 0.3750
2 | 0.0517 | 0.2931 | 0.0862 | 0.3793 | 0.1897
3 | 0.1647 | 0.2824 | 0.1059 | 0.2824 | 0.1647
4 | 0.0877 | 0.0877 | 0.4561 | 0.2895 | 0.0789
5 | 0.2203 | 0.0000 | 0.1356 | 0.5254 | 0.1186
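Each row of the matrix above sums to (approximately) one, so it is row-stochastic, and the chain's long-run share of time in each consumption state can be computed by power iteration. A sketch using the tabulated values, renormalizing at each step to absorb rounding in the published entries:

```python
P = [
    [0.1458, 0.1458, 0.2500, 0.0833, 0.3750],
    [0.0517, 0.2931, 0.0862, 0.3793, 0.1897],
    [0.1647, 0.2824, 0.1059, 0.2824, 0.1647],
    [0.0877, 0.0877, 0.4561, 0.2895, 0.0789],
    [0.2203, 0.0000, 0.1356, 0.5254, 0.1186],
]

def stationary(P, iters=500):
    """Power iteration pi_{t+1} = pi_t P; converges to the stationary
    distribution for an irreducible, aperiodic chain."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        s = sum(pi)               # renormalize against rounding of the entries
        pi = [p / s for p in pi]
    return pi

pi = stationary(P)
```

The resulting vector gives the long-run fraction of hours the consumer spends in each quantized consumption state.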

1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
0.0 | 0.0 | 0.0 | 0.0 | 0.91 | 0.91 | 0.91 | 0.91 | 0.0 | 0.0 | 0.0 | 0.0 | 0.91 | 0.91 | 0.91 | 0.91

**Table 5.** Joint probability of the individual value of FS. ${F}_{1}$ denotes the historical data, while ${F}_{2}$, ${F}_{3}$, and ${F}_{4}$ denote the target value, mean, and moving average value, respectively.

Binary | ${\mathit{F}}_{1}$ | ${\mathit{F}}_{2}$ | ${\mathit{F}}_{3}$ | ${\mathit{F}}_{4}$
---|---|---|---|---
0 | 0.6667 | 0.0000 | 0.4986 | 0.5002
1 | 0.6667 | 1.0000 | 0.5014 | 0.4998
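Per-feature binary entropies follow directly from such a table: a feature concentrated on a single value (like ${F}_{2}$) contributes zero entropy, while a near-uniform feature (like ${F}_{3}$ or ${F}_{4}$) contributes almost one bit. A minimal sketch, with the example values taken from the ${F}_{2}$ and ${F}_{3}$ columns:

```python
from math import log2

def entropy_bits(probs):
    """Shannon entropy H(F) = -sum p*log2(p), skipping zero-probability outcomes."""
    return -sum(p * log2(p) for p in probs if p > 0)

h_f2 = entropy_bits([0.0000, 1.0000])  # degenerate feature: no uncertainty
h_f3 = entropy_bits([0.4986, 0.5014])  # near-uniform feature: close to 1 bit
```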

Z | AR-MTLF | AFC-STLF | SVR | NB | Ensemble | KNN | ANN
---|---|---|---|---|---|---|---
1 | 0.32 | 0.42 | 8.22 | 9.01 | 8.92 | 0.98 | 5.81
2 | 0.35 | 11.40 | 1.03 | 5.52 | 5.68 | 11.87 | 10.12
3 | 0.40 | 13.30 | 1.52 | 5.88 | 5.92 | 13.02 | 11.54
4 | 0.31 | 0.25 | 0.58 | 0.48 | 2.45 | 0.30 | 3.18
5 | 0.40 | 0.58 | 1.10 | 0.83 | 0.78 | 0.72 | 1.53
6 | 0.51 | 11.81 | 1.30 | 7.02 | 6.37 | 11.0 | 9.92

Z | AR-MTLF | AFC-STLF | SVR | NB | Ensemble | KNN | ANN
---|---|---|---|---|---|---|---
1 | 100.00 | 125.50 | 37.52 | 39.20 | 38.10 | 38.35 | 30.21
2 | 142.01 | 146.76 | 36.21 | 39.27 | 39.32 | 37.22 | 35.90
3 | 161.32 | 172.34 | 35.89 | 38.23 | 38.322 | 35.22 | 30.21
4 | 158.23 | 170.39 | 42.91 | 43.82 | 42.49 | 40.21 | 35.32
5 | 178.39 | 184.87 | 38.21 | 39.88 | 40.44 | 37.92 | 30.77
6 | 150.45 | 160.76 | 39.87 | 40.32 | 40.88 | 38.22 | 30.55

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

Samuel, O.; Alzahrani, F.A.; Hussen Khan, R.J.U.; Farooq, H.; Shafiq, M.; Afzal, M.K.; Javaid, N. Towards Modified Entropy Mutual Information Feature Selection to Forecast Medium-Term Load Using a Deep Learning Model in Smart Homes. *Entropy* **2020**, *22*, 68. https://doi.org/10.3390/e22010068