Researcher in Computer Science and Engineering
PhD in Computer Engineering from the Department of Computer Engineering, Koç University.
Potential for Improvement of Sentence Translations
Instance Weighting in Neural Networks for Click-Through Rate Prediction
Neural Network Calibration for CTR Prediction
Efficiently Sampling in Neural Network Training for Click-Through Rate Prediction
Finding efficient downsampling techniques has become more crucial as the training datasets for advertisement click-through rate (CTR) prediction models grow to billions of examples. We present efficient downsampling to sample CTR datasets with the goals of faster training and a limited decrease in performance. We report encouraging results demonstrating the effectiveness of our approach on two publicly available CTR prediction datasets and compare efficient downsampling with stratified random downsampling.
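As a point of reference for the comparison mentioned above, the following is a minimal sketch of the stratified random downsampling baseline, assuming a tabular CTR dataset with a binary label column; the column name "click" and the sampling fraction are illustrative assumptions and the paper's own efficient downsampling method is not reproduced here.

```python
import pandas as pd

def stratified_downsample(df: pd.DataFrame, label_col: str = "click",
                          frac: float = 0.1, seed: int = 42) -> pd.DataFrame:
    """Sample the same fraction from each label group so that the click rate
    of the downsampled data matches that of the full dataset."""
    return (df.groupby(label_col, group_keys=False)
              .apply(lambda g: g.sample(frac=frac, random_state=seed)))

# Example usage (hypothetical file name):
# full = pd.read_csv("ctr_train.csv")
# small = stratified_downsample(full, label_col="click", frac=0.05)
```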
Residual Fusion Models with Neural Networks for CTR Prediction
Ergun Biçici. Residual Fusion Models with Neural Networks for CTR Prediction. 2023 8th International Conference on Computer Science and Engineering (UBMK), Burdur, Türkiye, 2023, pp. 1-4. doi: 10.1109/UBMK59864.2023.10286706. URL: https://ieeexplore.ieee.org/document/10286706
No single prediction model achieves the best performance on all datasets, so we are better off combining the strengths of different models for each task. Residual fusion learning is a two-step combination method that trains a second model on the residual of the target from the first model's prediction. In the final step, the predictions of both models are added. We use gradient boosting decision trees (GBDT) and neural networks as the initial model in residual fusion and compare three GBDT models and four neural network models. We introduce residual fusion with two different neural network models and show that we can achieve AUC gains of up to 0.95%.
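The two-step procedure described in the abstract can be illustrated with the short sketch below, assuming a GBDT first stage and a small neural network fit on the residuals; the concrete model families, features, and the probability/logit handling used in the paper may differ.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.neural_network import MLPRegressor

def residual_fusion_fit(X, y):
    # Step 1: train the base model on the original target.
    base = GradientBoostingRegressor().fit(X, y)
    # Residual of the target from the base model's prediction.
    residual = y - base.predict(X)
    # Step 2: train a second model on the residuals (hypothetical architecture).
    corrector = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500).fit(X, residual)
    return base, corrector

def residual_fusion_predict(base, corrector, X):
    # Final prediction adds the outputs of both models, as described in the abstract.
    return base.predict(X) + corrector.predict(X)
```

The sketch uses regression-style models to keep the residual arithmetic direct; for CTR prediction one would typically apply the same idea to predicted probabilities or logits before computing AUC.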