Some Thoughts on Copilot

GitHub Copilot is essentially an AI-powered autocomplete tool. Its main job is to suggest and complete segments of code as you write them. Here's a quick example of Copilot working as designed:
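The experience goes something like this: you type a function signature, and the rest shows up as a grayed-out suggestion waiting for a Tab press. (The function below is an invented stand-in, not a captured suggestion, but it's the right shape.)

    #include <stdbool.h>

    bool is_leap_year(int year) {
        // everything from here down is the kind of thing Copilot offers to fill in
        if (year % 400 == 0) return true;
        if (year % 100 == 0) return false;
        return year % 4 == 0;
    }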

Copilot is great at taking the context of the code you're writing and suggesting what should come next. It usually takes just a couple of Tab presses to get a simple method written out automatically.

In addition to completing code, Copilot can also write code using only the context of a comment you've written, even if it contains a strange request. Like this:

(okay, I had to do it)
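Joking aside, the plain version of the trick is: write a comment describing what you want, and the whole function appears under it. Something like this (an invented example, much tamer than the request above):

    #include <stdio.h>

    // print the first n square numbers, one per line
    void print_squares(int n) {
        for (int i = 1; i <= n; i++) {
            printf("%d\n", i * i);
        }
    }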

And finally, Copilot is also pretty good at helping write blog posts. The suggestion engine is quick enough that I can usually just hit "Tab" to finish a sentence faster than I can type it.

Okay look. Maybe that example wasn't the greatest, but you get the point. Copilot has a tendency to use the context of your project files to create better suggestions, and this project is full of weird half-written blog posts. But this is a good segue into what I really want to write about: The weird ways you can trick AI into doing things that you want it to do, and the ways AI can just make you facepalm.

Copilot is just really bad at not copying online examples

Imagine you're in a job interview and you're asked to write "fizzbuzz" in a language that you don't know. That's fine, it's the future and you're allowed to use Copilot for some reason. You open up "interview.c", copy/paste the instructions into the file, and let Copilot do the rest:
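The exact suggestion will vary, but what comes back is some version of the textbook answer:

    #include <stdio.h>

    int main(void) {
        for (int i = 1; i <= 100; i++) {
            if (i % 15 == 0) {
                printf("FizzBuzz\n");
            } else if (i % 3 == 0) {
                printf("Fizz\n");
            } else if (i % 5 == 0) {
                printf("Buzz\n");
            } else {
                printf("%d\n", i);
            }
        }
        return 0;
    }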

Great, looks good! Thanks, robot. But now the interviewer asks you to do it again, but with a slightly different set of rules:

Write a program that prints the numbers from 1 to 100 in reverse order. But for multiples of three print “Fuzz” instead of the number and for the multiples of five print “Buzz”. For numbers which are multiples of both three and five, print the number.

Okay, no problem. We'll just copy/paste those instructions into the IDE and let Copilot take the wheel:
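What comes back is some version of this: the new instructions sitting right on top of the same old loop. The details of the suggestion vary, but the mistake doesn't:

    /*
     * Write a program that prints the numbers from 1 to 100 in reverse order.
     * But for multiples of three print "Fuzz" instead of the number and for
     * the multiples of five print "Buzz". For numbers which are multiples of
     * both three and five, print the number.
     */
    #include <stdio.h>

    int main(void) {
        for (int i = 1; i <= 100; i++) {
            if (i % 15 == 0) {
                printf("FizzBuzz\n");
            } else if (i % 3 == 0) {
                printf("Fizz\n");
            } else if (i % 5 == 0) {
                printf("Buzz\n");
            } else {
                printf("%d\n", i);
            }
        }
        return 0;
    }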

You submit the code without even looking at it and fail the interview, because Copilot just decided to print the default FizzBuzz.

I wouldn't consider this a problem with Copilot so much as a problem with the way people might use it. Copilot isn't actually writing code; it's just looking at context and spitting out whatever it thinks is most likely correct, based on something it saw once on the internet. As shown above, this creates situations where Copilot doesn't write exactly what you asked for, because some article on the internet described how to do something similar to what you asked for. The upside to this quirk: if you're following along with an article online and copying its code, Copilot will usually suggest the article's code line for line, even without any extra context.

Sometimes Copilot just breaks because it hates you.

After using Copilot for a bit, you'll start trusting it blindly. Don't! AI is out to kill us and replace humanity. After writing another blog post and proofreading it, I found this gem:

[...] stays on. This creat*e*s some awkward situations where [...]

It's impossible to tell why it decided to italicize just the letter E, but it lowered my confidence in the entire blog post, and I had to proofread it much more slowly than I would have liked.

Sometimes it adds some really strange comments into the suggested code. I've seen plenty of "TODO: Fix This" and "TODO: Make This Work" comments sneak their way into my code. While this completion suggestion is actually pretty impressive (peep that rare for/else syntax), the TODO is just a funny level of jank:
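For anyone who hasn't run into it, Python's for/else is real syntax: the else block only runs if the loop finishes without hitting a break. The snippet below is an invented sketch of that kind of completion (made-up names, TODO included), not the original suggestion:

    def find_user(users, name):
        for user in users:
            if user.name == name:
                break
        else:
            # TODO: Fix This
            return None
        return user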

Conclusion

Copilot is pretty cool. OpenAI pls give DALL-E access to me.

