Towards Efficient Fine-Tuning of Pre-trained Code Models: An Experimental Study and Beyond
One Adapter for All Programming Languages? Adapter Tuning for Code Search and Summarization
Citation: Martin Weyssow, Xin Zhou, Kisub Kim, David Lo, and Houari Sahraoui. Exploring Parameter-Efficient Fine-Tuning Techniques for Code Generation with Large Language Models.
Paper: 2206.00052 (arxiv.org). Code: https://github.com/reddy-lab-c
On Extracting Specialized Code Abilities from Large Language Models: A Feasibility Study
Multi-target Backdoor Attacks for Code Pre-trained Models. Yanzhou Li et al.
Compressing Pre-trained Models of Code into 3 MB. Jieke Shi et al.
Can Large Language Models Write Good Property-Based Tests?
You See What I Want You to See: Poisoning Vulnerabilities in Neural Code Search
You Autocomplete Me: Poisoning Vulnerabilities in Neural Code Completion
Natural Attack for Pre-trained Models of Code. Zhou Yang, Jieke Shi, Junda He, David Lo
Poison Attack and Poison Detection on Deep Source Code Processing Models
Is ChatGPT the Ultimate Programming Assistant - How far is it?
Can ChatGPT Pass An Introductory Level Functional Language Programming Course?
ChatGPT and Software Testing Education: Promises & Perils. Sajed Jalil et al.
CodeXGLUE: A Machine Learning Benchmark Dataset for Code Understanding and Generation
An Empirical Comparison of Pre-Trained Models of Source Code
An extensive study on pre-trained models for program understanding and generation
Citation: Mastropaolo A, Cooper N, Palacio D N, et al. Using Transfer Learning for Code-Related Tasks. IEEE Transactions on Software Engineering, 2022.
A critical review of large language model on software engineering: An example from ChatGPT and automated program repair