TY - GEN
T1 - An Analysis of Car Price Prediction using Machine Learning
AU - Bhatnagar, Parth
AU - Lokesh, Gururaj Harinahalli
AU - Shreyas, J.
AU - Flammini, Francesco
AU - Gautam, Shivansh
N1 - Publisher Copyright:
© 2024 ACM.
PY - 2024/5/24
Y1 - 2024/5/24
N2 - This research paper explores machine learning techniques, such as voting regressors, gradient boosting regressors, random forest regressors, decision tree regressors, and support vector regressors, for predicting car prices. Each machine learning technique has its own unique advantages and disadvantages, with the voting regressor exhibiting the best results. Methodologically, GridSearchCV is used to tune hyperparameters on a dataset of more than 200 automobiles, each with 26 parameters. The outcomes demonstrate the predictive power of regression and ensemble techniques, providing insightful information to practitioners in industry and academia alike. The training accuracies are 16.87% (MAPE) for Linear Regression, 96.78% for Decision Tree Regressor, 96.49% for Random Forest Regressor, 97.84% for Gradient Boosting Regressor, 95.8% for Voting Regressor, and 81.89% for Support Vector Regressor. Notably, the testing accuracies are 19.44% (MAPE) for Linear Regression, 87.76% for Decision Tree Regressor, 89.75% for Random Forest Regressor, 88.67% for Gradient Boosting Regressor, 88.02% for Voting Regressor, and 79.55% for Support Vector Regressor.
AB - This research paper explores machine learning techniques, such as voting regressors, gradient boosting regressors, random forest regressors, decision tree regressors, and support vector regressors, for predicting car prices. Each machine learning technique has its own unique advantages and disadvantages, with the voting regressor exhibiting the best results. Methodologically, GridSearchCV is used to tune hyperparameters on a dataset of more than 200 automobiles, each with 26 parameters. The outcomes demonstrate the predictive power of regression and ensemble techniques, providing insightful information to practitioners in industry and academia alike. The training accuracies are 16.87% (MAPE) for Linear Regression, 96.78% for Decision Tree Regressor, 96.49% for Random Forest Regressor, 97.84% for Gradient Boosting Regressor, 95.8% for Voting Regressor, and 81.89% for Support Vector Regressor. Notably, the testing accuracies are 19.44% (MAPE) for Linear Regression, 87.76% for Decision Tree Regressor, 89.75% for Random Forest Regressor, 88.67% for Gradient Boosting Regressor, 88.02% for Voting Regressor, and 79.55% for Support Vector Regressor.
UR - https://www.scopus.com/pages/publications/85204676235
U2 - 10.1145/3674029.3674032
DO - 10.1145/3674029.3674032
M3 - Conference contribution
AN - SCOPUS:85204676235
T3 - ACM International Conference Proceeding Series
SP - 11
EP - 15
BT - Proceedings of the 2024 9th International Conference on Machine Learning Technologies, ICMLT 2024
PB - Association for Computing Machinery
T2 - 9th International Conference on Machine Learning Technologies, ICMLT 2024
Y2 - 24 May 2024 through 26 May 2024
ER -