Experimenting with GitHub Copilot

This is my first post since projects were selected, so it's worth getting readers up to speed.

I did get placed with the two team members I had coordinated with before class (along with three others), and we got our first-choice project: SaaS Application for Fire Department 911 Risk Analysis, for our sponsor Levrum Data Technologies.

The application calculates and visualizes response times for networks of fire stations.

A detail of the existing UI showing the fire station response time map visualization for Corvallis, Oregon; credit: Levrum Data Technologies

This is a continuation of an existing OSU project. A prototype of the server and browser-based front end already exists. Our team is extending that work by implementing a database and a control plane. There are a few issues with that:

  • Only one member of our team has prior experience with .NET & C#
  • The existing code is well modularized, but that also means it's highly interconnected
  • The existing code is minimally commented
  • There is very little external documentation

To understand and document the code, we've employed traditional tools like auto-generated code maps. I've also experimented with GitHub Copilot to see whether it could speed up the process.

Here are my findings:

People who have spent any significant time interacting with Large Language Models (LLMs) and generative AI will point out that it's sometimes necessary to change the wording of a request to get what you want (especially if you're trying to bypass security features with a prompt injection).

This remains true with GitHub Copilot. But for certain functions, the precise details of the prompts are invisible to the user; you have to experiment with the canned choices by selecting code sections of varying lengths.

When highlighting a class method and selecting “Explain the selected code,” I’ve found the generated text is generally accurate but not necessarily easier to understand than the code itself. It doesn’t reach outside the highlighted code for context: it will tell you about an argument and its type, but it won’t explain what else in the codebase might call the method and pass that argument.

In fact, when I tried to use the chat to prompt Copilot to do so, I was informed it does not have access to the file structure. Of course, that makes sense; the LLM isn't running on my workstation.
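
To make that limitation concrete, here's a hypothetical C# sketch (the class, method names, and distance math are mine, not from Levrum's codebase). Highlight only the method and Copilot will describe its parameters and return value accurately, but it can't tell you anything about the callers elsewhere in the solution:

```csharp
using System;

// Hypothetical example, not Levrum's actual code: the kind of method you might
// highlight and ask Copilot to explain.
public record GeoPoint(double Lat, double Lon);

public class Station
{
    public GeoPoint Location { get; init; } = new(0, 0);
    public double TurnoutTimeSeconds { get; init; }
    public double AverageSpeedKmh { get; init; } = 50;
}

public class CoverageCalculator
{
    // Copilot will accurately describe these parameters and the return value...
    public double EstimateResponseSeconds(Station station, GeoPoint incident)
    {
        // Straight-line distance as a stand-in for real travel distance.
        double dLat = (incident.Lat - station.Location.Lat) * 111.0; // ~km per degree of latitude
        double dLon = (incident.Lon - station.Location.Lon) * 111.0 *
                      Math.Cos(station.Location.Lat * Math.PI / 180);
        double distanceKm = Math.Sqrt(dLat * dLat + dLon * dLon);

        // Turnout time plus travel time, in seconds.
        return station.TurnoutTimeSeconds + distanceKm / station.AverageSpeedKmh * 3600;
    }
    // ...but it can't tell you that a map-rendering class in another file might call
    // EstimateResponseSeconds for every station/incident pair, because it never sees
    // those other files.
}
```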

However, highlighting a larger block, like a class with multiple methods, is much more useful. Copilot will provide a formatted breakdown of the class.

A partial example of an explanation of a C# class generated by GitHub Copilot; credit: Microsoft

It even provides a helpful prompt suggestion:

Copilot suggestion to define the Strategy design pattern; credit: Microsoft

There’s also a Visual Studio Code-only plugin called GitHub Copilot Labs. It’s for experimental Copilot features, and that’s apparent when you use it.

The options look promising and helpful:

Code modification options provided by the GitHub Copilot Labs plugin; credit: Microsoft

In practice, you can get different, usually broken, results every time you try one. Below is an example of the automatic documentation. Not only did it repeat the same two lines ten times, it also provided an incorrect definition of coverage and broke the code by duplicating the class definition line.

Copilot Labs automatic documentation may create good comments, or it may do this; credit: Levrum

So, Labs is experimental, as advertised. It may be interesting to play with, but it’s not a tool to use for development.

Conclusions

I haven’t had a chance to try the main attraction: pair coding with Copilot. From what I’ve read, it’s useful except when it’s not.

The basic functions of GitHub Copilot are quite useful once you understand how they are best employed. There’s just a very slight learning curve. The Labs functions may be useful in the future if they ever work.

It’s important to realize that LLMs don’t understand anything; they’re statistical inference engines. They can help speed things up a lot, but you always have to check their work. To be fair, that’s also true of people. Hopefully not the understanding part, in most cases.