TY - JOUR
T1 - Chebyshev polynomial approximation in CNN for zero-knowledge encrypted data analysis
AU - Ghanimi, Hayder M.A.
AU - Gopalakrishnan, T.
AU - Joel Sunny Deol, G.
AU - Amarendra, K.
AU - Dadheech, Pankaj
AU - Sengan, Sudhakar
N1 - Publisher Copyright:
© TARU PUBLICATIONS.
PY - 2024/3
Y1 - 2024/3
N2 - Integrating Deep Learning (DL) techniques in Convolutional Neural Networks (CNNs) with encrypted data analysis is an emerging field for enhancing data privacy and security. A significant challenge in this domain is the incompatibility of standard non-linear Activation Functions (AF), such as the Rectified Linear Unit (ReLU) and Hyperbolic Tangent (tanh), with Zero-Knowledge (ZK) encrypted data, which impacts computational efficiency and data privacy. To address this, our paper introduces a novel application of Chebyshev Polynomial Approximation (CPA) to adapt these AF to process encrypted data effectively. Using the MNIST dataset, we conducted experiments with LeNet and various configurations of AlexNet, extending the input range of the ReLU and tanh functions to optimize the CPA. Our results reveal an optimal polynomial degree (α), with α = 10 for ReLU and between α = 10 and α = 15 for tanh, beyond which the accuracy gains plateau. This finding is crucial for ensuring the accuracy and efficiency of CNNs that process encrypted data. The study demonstrates that although accuracy decreases slightly for plaintext data and more significantly for ciphertext data, the overall effectiveness of CPA in CNNs is maintained. This advancement enables CNNs to process encrypted data while preserving privacy and marks a significant step toward privacy-preserving Machine Learning (ML) and encrypted data analysis.
AB - Integrating Deep Learning (DL) techniques in Convolutional Neural Networks (CNNs) with encrypted data analysis is an emerging field for enhancing data privacy and security. A significant challenge in this domain is the incompatibility of standard non-linear Activation Functions (AF), such as the Rectified Linear Unit (ReLU) and Hyperbolic Tangent (tanh), with Zero-Knowledge (ZK) encrypted data, which impacts computational efficiency and data privacy. To address this, our paper introduces a novel application of Chebyshev Polynomial Approximation (CPA) to adapt these AF to process encrypted data effectively. Using the MNIST dataset, we conducted experiments with LeNet and various configurations of AlexNet, extending the input range of the ReLU and tanh functions to optimize the CPA. Our results reveal an optimal polynomial degree (α), with α = 10 for ReLU and between α = 10 and α = 15 for tanh, beyond which the accuracy gains plateau. This finding is crucial for ensuring the accuracy and efficiency of CNNs that process encrypted data. The study demonstrates that although accuracy decreases slightly for plaintext data and more significantly for ciphertext data, the overall effectiveness of CPA in CNNs is maintained. This advancement enables CNNs to process encrypted data while preserving privacy and marks a significant step toward privacy-preserving Machine Learning (ML) and encrypted data analysis.
UR - https://www.scopus.com/pages/publications/85191862158
U2 - 10.47974/JDMSC-1880
DO - 10.47974/JDMSC-1880
M3 - Article
AN - SCOPUS:85191862158
SN - 0972-0529
VL - 27
SP - 203
EP - 214
JO - Journal of Discrete Mathematical Sciences and Cryptography
JF - Journal of Discrete Mathematical Sciences and Cryptography
IS - 2
ER -