Estimate the maximum scour depth around bridge piers as a function of variables including flow depth, mean velocity, pier size, median grain size, and skew, using recorded field measurements as training data.
This is a research-based project whose dataset is extracted from the report "Field Observations and Evaluations of Streambed Scour at Bridges", published in May 2005 for research purposes.
The dataset contains independent features such as velocity, depth, pier type, pier width, and pier length, on which the dependent feature, the scour depth around the pier, depends, along with factors such as skew, bed material, and debris effects that influence it.
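As a minimal sketch of how the features and target separate, the snippet below builds a tiny synthetic frame; the column names and values are illustrative assumptions, not the actual dataset's headers or records.

```python
import pandas as pd

# Hypothetical columns mirroring the dataset's independent features and target.
df = pd.DataFrame({
    "velocity":    [0.8, 1.2, 1.5, 0.9],   # mean flow velocity (m/s)
    "depth":       [2.1, 3.4, 1.8, 2.7],   # flow depth (m)
    "pier_width":  [1.0, 1.5, 1.2, 0.9],   # m
    "pier_length": [6.0, 8.0, 7.5, 5.5],   # m
    "skew":        [0, 15, 30, 0],         # degrees
    "d50":         [0.4, 0.6, 0.3, 0.5],   # median grain size (mm)
    "scour_depth": [1.1, 1.9, 2.4, 1.0],   # dependent feature (m)
})
X = df.drop(columns="scour_depth")  # independent features
y = df["scour_depth"]               # dependent feature
print(X.shape, y.shape)
```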
There are three velocity regimes to consider when estimating scour depth around piers and bridges: current-only, wave-only, and combined waves and current. Here, the dataset is limited to the current-only case.
The workflow begins with data preprocessing, removal of anomalous entries, and elimination of irrelevant or redundant columns using a correlation matrix. This is followed by training and testing supervised machine-learning regression algorithms, such as Gradient Boosting, XGBoost, Random Forest, and LightGBM, on the scour data available in the literature.
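A sketch of the correlation-based column filtering and model training, under stated assumptions: synthetic data stands in for the scour dataset, the 0.95 correlation threshold is an illustrative choice, and scikit-learn's Gradient Boosting and Random Forest are shown (XGBoost and LightGBM follow the same fit/predict pattern but require separate installs).

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "velocity":   rng.uniform(0.5, 2.0, n),
    "depth":      rng.uniform(1.0, 5.0, n),
    "pier_width": rng.uniform(0.5, 2.0, n),
})
df["redundant"] = df["velocity"] * 2  # perfectly correlated stand-in column
df["scour_depth"] = 1.3 * df["velocity"] + 0.4 * df["depth"] + rng.normal(0, 0.1, n)

# Drop one of any feature pair whose |correlation| exceeds the threshold.
corr = df.drop(columns="scour_depth").corr().abs()
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
to_drop = [c for c in upper.columns if (upper[c] > 0.95).any()]
X = df.drop(columns=["scour_depth"] + to_drop)
y = df["scour_depth"]

# Train/test split, then fit and score each regressor.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)
for model in (GradientBoostingRegressor(random_state=42),
              RandomForestRegressor(random_state=42)):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, round(r2_score(y_te, model.predict(X_te)), 3))
```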
However, we tuned the model hyperparameters manually instead of using grid search.
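Manual tuning here simply means passing hand-picked hyperparameters to the estimator constructor rather than sweeping them with `GridSearchCV`. The values below are illustrative assumptions, not the project's actual settings.

```python
from sklearn.ensemble import GradientBoostingRegressor

# Hand-picked hyperparameters (illustrative values only).
model = GradientBoostingRegressor(
    n_estimators=300,    # more trees paired with a lower learning rate
    learning_rate=0.05,
    max_depth=3,         # shallow trees to limit overfitting
    subsample=0.8,       # stochastic gradient boosting
    random_state=42,
)
print(model.get_params()["n_estimators"])
```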
We then use the feature-importance values of the independent variables to condense the data and simplify the model input.
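Tree-based regressors expose a `feature_importances_` attribute after fitting, which can be ranked to pick the strongest predictors. The sketch below uses synthetic data and generic feature names as assumptions.

```python
import pandas as pd
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in: 8 candidate features, 5 of them truly informative.
X, y = make_regression(n_samples=300, n_features=8, n_informative=5,
                       random_state=0)
cols = [f"f{i}" for i in range(8)]

rf = RandomForestRegressor(random_state=0).fit(X, y)

# Rank features by importance and keep the top 5.
importances = pd.Series(rf.feature_importances_, index=cols)
top5 = importances.sort_values(ascending=False).head(5).index.tolist()
print(top5)
```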
Finally, we develop a new model using those top 5 important features as inputs and improve its accuracy with a Voting Regressor ensemble.
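The final ensemble step can be sketched with scikit-learn's `VotingRegressor`, which averages the predictions of its base estimators. Synthetic 5-feature data stands in for the selected top-5 inputs, and the two base regressors shown are assumptions (the project's ensemble may combine different models).

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import (GradientBoostingRegressor,
                              RandomForestRegressor, VotingRegressor)
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Stand-in for the reduced dataset with the 5 selected features.
X, y = make_regression(n_samples=300, n_features=5, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# VotingRegressor averages the predictions of its fitted base models.
vote = VotingRegressor([
    ("gb", GradientBoostingRegressor(random_state=0)),
    ("rf", RandomForestRegressor(random_state=0)),
])
vote.fit(X_tr, y_tr)
print(round(r2_score(y_te, vote.predict(X_te)), 3))
```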