I am a research scientist at Huawei Technologies in Beijing.

My current research covers language identification, machine translation, and LLM agents. If you are interested in working with us, please feel free to email me at dingjianbang1@huawei.com. We are also hiring interns!

I received my master's degree from the School of Intelligence Science and Technology, Peking University (北京大学智能学院), supervised by Prof. Zhihong Deng (邓志鸿), and my bachelor's degree from the Qian Xuesen Class, Xidian University (西安电子科技大学钱学森班). I also collaborate closely with Prof. Xu Sun (孙栩).

I won the CCL 2023 Best Poster Award and the Huawei Top-10 Patents Nomination Award (华为终端BG十大发明提名). My work was selected by the Xi'an Municipal Government for exhibition at the 2017 Silk Road International Expo (2017丝绸之路国际博览会). I have published several papers at international AI conferences such as IJCNN, NLPCC, and CCL.

🔥 News

  • 2024.08: The AdaMod optimizer has garnered 50+ citations and 120+ stars
  • 2024.03: One paper is accepted by IJCNN 2024
  • 2023.08: 🎉 Won the CCL 2023 Best Poster Award
  • 2023.07: One paper is accepted by NLPCC 2023
  • 2023.06: One paper is accepted by CCL 2023

📝 Publications

arXiv Preprint

An Adaptive and Momental Bound Method for Stochastic Learning
Jianbang Ding, Xuancheng Ren, Ruixuan Luo, Xu Sun
[Paper] [Code] [Video]

We propose AdaMod, an improvement over the Adam optimizer that adds a long-term memory of past adaptive learning rates.

Just `pip install adamod` to try it! It now has several variants and implementations, supporting PyTorch, TensorFlow, and Keras.
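For readers curious about the mechanics, here is a minimal pure-Python sketch of the AdaMod update rule as described in the paper: a standard Adam step whose per-step learning rate is additionally clipped by an exponential moving average of past step sizes (controlled by a third coefficient, `beta3`). Function and variable names here are illustrative, not the library's API.

```python
import math

def adamod_minimize(grad_fn, x0, lr=0.1, beta1=0.9, beta2=0.999,
                    beta3=0.999, eps=1e-8, steps=3000):
    """Minimize a 1-D function with the AdaMod update rule:
    Adam plus a moving-average bound on the step size."""
    x = x0
    m = v = s = 0.0  # first moment, second moment, step-size memory
    for t in range(1, steps + 1):
        g = grad_fn(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)          # bias-corrected first moment
        v_hat = v / (1 - beta2 ** t)          # bias-corrected second moment
        eta = lr / (math.sqrt(v_hat) + eps)   # Adam's adaptive step size
        s = beta3 * s + (1 - beta3) * eta     # long-term memory of step sizes
        eta = min(eta, s)                     # clip extreme learning rates
        x -= eta * m_hat
    return x

# Minimize f(x) = x^2 (gradient 2x) starting from x = 5;
# x approaches the minimum at 0.
x_min = adamod_minimize(lambda x: 2 * x, 5.0)
```

The `min(eta, s)` clip is the whole trick: when `v_hat` collapses near a minimum, plain Adam's step size `lr / sqrt(v_hat)` can spike, while AdaMod caps it by the smoothed history `s`.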

NLPCC 2023

An Adaptive Learning Method for Solving the Extreme Learning Rate Problem of Transformer
Jianbang Ding, Xuancheng Ren, Ruixuan Luo
[Paper] [Code] [Video]

This work conducts further empirical studies on AdaMod.

Some third-party comments:

  • “In testing AdaMod on some datasets along with other optimizers, I find that AdaMod is consistently a top 5 optimizer.” ——Less Wright
  • “I’ve had great success with this wrapped in lookahead.” ——Evan Walters

CCL 2023

Adder Encoder for Pre-trained Language Model
Jianbang Ding, Suiyun Zhang, Linlin Li
[Paper] [Poster]

🎉CCL Best Poster Award
AdderBERT achieves performance highly competitive with BERT-base on the GLUE benchmark while obtaining a 4.9x reduction in energy consumption.

IJCNN 2024

Disfluency Detection for Real-World Scenarios
Jianbang Ding, Suiyun Zhang, Dandan Tu
[Paper] [Slide] [Video]

Oral Paper
Our approach significantly outperforms previous baselines and achieves state-of-the-art performance (94.3 F-score) on the English Switchboard corpus.

🎖 Honors and Awards

  • 2023.12 Huawei Top-10 Patents Nomination Award
  • 2023.08 CCL 2023 Best Poster Award
  • 2021.06 Outstanding Graduates of Peking University
  • 2018.06 Outstanding Graduates of Xidian University
  • 2017.09 National Scholarship (Top 1%)

📖 Education

  • 2018.09 - 2021.07, Master, Peking University, Beijing.
  • 2014.09 - 2018.07, Bachelor, Xidian University, Xi’an.
  • 2010.09 - 2013.07, Northeast Yucai School, Shenyang.

💻 Experiences

  • 2021.08 - present, research scientist, Celia Dept, Huawei, Beijing.
  • 2020.07 - 2020.09, research intern, Noah’s Ark Lab, Huawei, Beijing.
  • 2017.11 - 2018.05, research intern, Qian Xuesen Lab, China Academy of Space Technology, Beijing.