Commit ac21fab

Author: Miltos Allamanis
Commit message: Merge tags
1 parent: 87cbfb9

File tree

4 files changed: +4 −4 lines changed

_publications/aye2020learning.markdown

Lines changed: 1 addition & 1 deletion
@@ -7,6 +7,6 @@ year: 2020
 bibkey: aye2020learning
 additional_links:
    - {name: "ArXiV", url: "https://arxiv.org/abs/2011.04542"}
-tags: ["autocompletion"]
+tags: ["autocomplete"]
 ---
Code completion is a popular software development tool integrated into all major IDEs. Many neural language models have achieved promising results in completion suggestion prediction on synthetic benchmarks. However, a recent study, "When Code Completion Fails: a Case Study on Real-World Completions", demonstrates that these results may not translate to improvements in real-world performance. To combat this effect, we train models on real-world code completion examples and find that these models outperform models trained on committed source code and working version snapshots by 12.8% and 13.8% accuracy respectively. We observe this improvement across modeling technologies and show through A/B testing that it corresponds to a 6.2% increase in programmers' actual autocompletion usage. Furthermore, our study characterizes a large corpus of logged autocompletion usages to investigate why training on real-world examples leads to stronger models.

_publications/kim2020code.markdown

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@ bibkey: kim2020code
 additional_links:
    - {name: "ArXiV", url: "https://arxiv.org/abs/2003.13848"}
    - {name: "Code", url: "https://github.com/facebookresearch/code-prediction-transformer"}
-tags: ["autocompletion"]
+tags: ["autocomplete"]
 ---
In this paper, we describe how to leverage the Transformer, a recent neural architecture for learning from sequential data (such as text), for code completion. As in the realm of natural language processing, Transformers surpass the prediction accuracy achievable by RNNs; we confirm this experimentally on a Python dataset.

_publications/svyatkovskiy2020fast.markdown

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ year: 2020
 bibkey: svyatkovskiy2020fast
 additional_links:
    - {name: "ArXiV", url: "https://arxiv.org/abs/2004.13651"}
-tags: ["autocompletion"]
+tags: ["autocomplete"]
 ---
Code completion is one of the most widely used features of modern integrated development environments (IDEs). Deep learning has recently made significant progress in the statistical prediction of source code. However, state-of-the-art neural network models consume prohibitively large amounts of memory, imposing a computational burden on the development environment, especially when deployed on lightweight client devices.

_publications/svyatkovskiy2020intellicode.markdown

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ year: 2020
 bibkey: svyatkovskiy2020intellicode
 additional_links:
    - {name: "ArXiV", url: "https://arxiv.org/abs/2005.08025"}
-tags: ["autocompletion", "generative", "synthesis", "language model", "pretraining"]
+tags: ["autocomplete", "generative", "synthesis", "language model", "pretraining"]
 ---
In software development through integrated development environments (IDEs), code completion is one of the most widely used features. Nevertheless, the majority of integrated development environments only support completion of methods and APIs, or arguments.
In this paper, we introduce IntelliCode Compose, a general-purpose multilingual code completion tool capable of predicting sequences of code tokens of arbitrary types, generating up to entire lines of syntactically correct code. It leverages a state-of-the-art generative transformer model trained on 1.2 billion lines of source code in the Python, C#, JavaScript, and TypeScript programming languages. IntelliCode Compose is deployed as a cloud-based web service. It makes use of client-side tree-based caching, an efficient parallel implementation of the beam search decoder, and compute graph optimizations to meet edit-time completion suggestion requirements in the Visual Studio Code IDE and Azure Notebook.
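The change across all four files above is mechanical: the quoted tag "autocompletion" becomes "autocomplete" on the `tags:` line of each publication page's YAML front matter. A minimal sketch of a script that would apply the same rename follows; the `rename_tag` helper and its regex are illustrative, not part of this repository.

```python
import glob
import re

def rename_tag(text: str, old: str, new: str) -> str:
    """Replace a quoted tag on a front-matter ``tags:`` line only.

    Lines that do not start with ``tags:`` are left untouched, so the
    rename cannot accidentally edit the abstract body text.
    """
    return re.sub(
        r'^(tags:.*?)"%s"' % re.escape(old),
        lambda m: m.group(1) + '"%s"' % new,
        text,
        flags=re.MULTILINE,
    )

if __name__ == "__main__":
    # Apply the rename to every publication page, as this commit does.
    for path in glob.glob("_publications/*.markdown"):
        with open(path, encoding="utf-8") as f:
            text = f.read()
        updated = rename_tag(text, "autocompletion", "autocomplete")
        if updated != text:
            with open(path, "w", encoding="utf-8") as f:
                f.write(updated)
```

Anchoring the pattern to the `tags:` key keeps occurrences of "autocompletion" in the abstracts (as in the aye2020learning body text) unchanged, matching what the diffs show.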
