Google gifts a Free AI Coding Assistant to the developer community
53 comments
February 27, 2025 · _mitterpach
ndr
This is bad advice for people with the correct posture.
If you want to learn: don't use these models to do things for you. Do use these models to learn.
LLMs might not be the best teachers in the world, but they're available right there and then when you have a question. Don't take their answers for granted; test them. Ask them to teach you the correct terms for things you don't know yet, so you can ask better questions.
There has never been a better time to learn. You don't have to learn the same way your predecessors did. You can learn faster and better, but you must be vigilant that you're not fooling yourself.
jonplackett
I feel bad for anyone learning to code now. The temptation to just LLM it must be sooooo high.
But at the same time, having a personalised StackOverflow without the negative attitude at your fingertips is super helpful, provided you also do the work of learning.
gvurrdon
> personalised StackOverflow without the negative attitude
Phrased in that way, it does sound very tempting. Over the past few years it's become pretty much a waste of time to post on SO (well, in my experience, anyway).
soco
But wouldn't you learn if you actually had to enter and test that code, even if it's LLM-generated, every day? Maybe you learn bad patterns, which can happen from SO as well, but you do learn. I'm more worried that the juniors won't have this opportunity anymore, or rather, that they won't be needed anymore. So what happens when I retire? Unless AI gets better and replaces everybody; then it won't matter at all what and how you learned.
ndr
The non-curious, non-skeptical people might have better luck learning by staying with books and coding offline.
For the curious and skeptical, there has never been a better moment to pick anything up. I don't know how we make more people more curious and skeptical.
myaccountonhn
The errors and inefficiencies LLMs make are very subtle. You're also just stuck with whatever it's trained on. I echo OP: learn from documentation and code. This is as true now as back when Stack Overflow was used for everything.
worthless-trash
Having seen the code they make, don't learn from these models. Remember they are trained on a lot of code, not a lot of good code.
joseda-hg
They don't necessarily do, but you can get them pretty far. One of the most interesting parts of LLMs (chat-based ones, I guess; I haven't tried the Copilot-style ones enough) is that smallish rewrites are really low cost.
Don't like the way it did something convoluted, or that it didn't use early returns? Say so; it will do it. Chain as many requests as you need; it won't get fed up with you. And if you see it losing detail because of memory, use those requests to build a significantly more polished prompt for a new chat and a cleaner starting point. (See the sketch below for the kind of early-return rewrite I mean.)
Don't just accept what it gives you either
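A hypothetical illustration of that early-return rewrite, in TypeScript (the function and data shapes are invented for the example):

    // Before: the kind of convoluted nesting a first pass might produce.
    function discount(user: { active: boolean; orders: number } | null): number {
      if (user !== null) {
        if (user.active) {
          if (user.orders > 10) {
            return 0.2;
          }
          return 0.1;
        }
        return 0;
      }
      return 0;
    }

    // After asking for early returns: flatter and easier to scan.
    function discountFlat(user: { active: boolean; orders: number } | null): number {
      if (user === null || !user.active) return 0;
      return user.orders > 10 ? 0.2 : 0.1;
    }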
desdenova
Use documentation and real world code to learn, not slop.
fredgrott
I had this argument with Mark Cuban recently and he admitted he was wrong; my example was using a calculator to learn calculus....
We are forgetting what 'general purpose' means for learning and for real practical usage as a tool.
BTW, the best AI and computer science discussions are happening on Bluesky.
keyle
It's interesting you write this. I have long experience, and I use these autocompletes-on-drugs now... I can't see myself writing all the damn code myself anymore.
I remember the days of using books, having to follow the code bits in the book as I typed them. I don't remember diddly squat about it. Same for years of Stack Overflow. I'd just alt-tab 12 times, read the comments, then read another answer, assess the best answer. Massive waste of time.
Use all the technology you have at hand, I say. But be sure to understand what you auto-completed. If not, stop and learn.
pdimitar
> But be sure to understand what you auto-completed. If not, stop and learn.
But that's IMO exactly what your parent commenter says. Use LLMs only after you actually have a clue what they are producing. So if you are a beginner, basically don't, because you won't have any understanding.
nchagnet
Agreed, although I'd say there is a spectrum between "do everything the hardest way" and "don't learn anything". I think LLM-based tools can be a great time-saver for repetitive boilerplate code, and they can sometimes help you get a first draft of some implementation you're not fully seeing yet, but you definitely should not rely on them to write the whole thing for you if you want to learn.
infecto
Nope, the theme of the parent was...
>Don't use Copilot, Gemini, Cursor, or any other code-assisting tool for the first several years of your study or career.
I totally disagree with it too, and think it's no different from using a book or SO in the past. As a junior you copy-paste many more lines of code than you fully appreciate, and sometimes it simply takes time spent doing that to absorb the knowledge.
icepat
I tend to agree, but I'll extend it beyond computer science students and say it applies especially to people who are self-learning. When I was getting started, I actively tried to minimize the number of packages and abstraction tools I used. Consequently, I was able to properly and deeply understand how things worked. Only once I was able to really explain and use a tool would I accept an automated solution for it.
On the flip side, I've now found that getting AI to kick the tires on something I'm not super well versed in helps me figure out how it works. But that's only because I understand how other things work.
If you're going to use AI in your learning, I think the best way to do that is to ask it for an example, or an incomplete implementation. Then you can learn in bits while still getting things done.
prismatix
I just interviewed someone for a senior position who had been using these AI copilots for 1.5 years as a contractor. In the interview I politely said I wanted to evaluate their skills without AI, so no Cursor/Copilot allowed. They did not remember how to map over an array, define a function, add click/change handlers to an input, etc.
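For reference, the level of basics in question, as a plain TypeScript sketch (the snippets are my paraphrase, not the actual interview questions):

    // Map over an array.
    const doubled = [1, 2, 3].map((n) => n * 2); // [2, 4, 6]

    // Define a function.
    function total(items: number[]): number {
      return items.reduce((sum, n) => sum + n, 0);
    }

    // Add click/change handlers to an input.
    const input = document.querySelector<HTMLInputElement>("#name")!;
    input.addEventListener("click", () => console.log("clicked"));
    input.addEventListener("change", () => console.log("value:", input.value));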
Drakim
I agree, but I'll add that you can still use a standalone LLM window as a "teacher". If you don't know how to do something, ask it how to do it, and make it explain every piece of what's going on so you truly absorb it. But don't let it write the code FOR you, you should implement it yourself.
DrScientist
Same sort of advice as "don't copy verbatim when writing your essays" - i.e., for you to learn, it has to flow through your brain.
However, the advice for essays doesn't include not looking at textbooks or papers - just not blindly copying them.
So perhaps you should use coding assistants - but always in a mode where you use them as a source to write the code yourself, rather than cutting and pasting or letting them edit directly.
BLubliulu
I think this is not a good idea or suggestion at all.
If I use Google Maps to find my way around, I'm faster by a lot. I also remember things, despite Google Maps doing the work for me.
Use code assistants as much as possible, but make sure to read and understand what you get, and try to write it yourself, even if you're just retyping it from a second window.
At this age and pace, the relevance of writing code will change in the next few years anyway.
Rebuff5007
You (a SWE) using Google Maps is a bit different from a cab driver relying on Google Maps. The former is fine; the latter could be concerning.
I'd argue that coding assistants hinder a SWE's ability to build subject matter expertise.
kace91
I think there is a lot of room to leverage it, while resisting the temptation to have it do your work.
You can get it to provide feedback on code quality, or to suggest refactors and its reasoning (with the explanation rather than the full solution), basically treating it as an always-available study group.
There is probably room for a course or book on a methodology that allows students to engage in this practice, or models with prompts that forbid straight completion and just provide help aimed at students.
xnx
I thought it used to be courtesy for blogs to link to the source?
https://blog.google/technology/developers/gemini-code-assist...
smallerfish
Here's the terms of service for the plugin: https://developers.google.com/gemini-code-assist/resources/p...
This says nothing about Google's use of the data. Anybody have a better link?
terminalbraid
Apropos of nothing: Annatar was a very good disguise for Sauron to use to gain the elves' trust while secretly having them forge something for himself.
pdimitar
Curious to try this, as most other tools have fairly stingy free tiers where you basically get one good interaction and boom, now you have to pay.
(Zero analogies to drug dealers. Wink wink.)
So can this be used standalone, or must you use JetBrains / VSCode / Firebase / GitHub, with no other option? I am not seeing any.
FergusArgyll
I think it's only for those editors. Google does also give out a free API key (aistudio.google.com) for the underlying models (though not the coding fine-tuned one), but IMO the free tier is rate-limited a bit too much to build your own extension on top of it for free.
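If you want to poke at that free key directly, here's a minimal sketch in TypeScript (the endpoint shape and model name are assumptions based on the public REST docs and may have changed; check what AI Studio currently offers):

    // Minimal Gemini REST call with a free AI Studio key (Node 18+).
    const key = process.env.GEMINI_API_KEY;
    const url = `https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key=${key}`;

    const res = await fetch(url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        contents: [{ parts: [{ text: "Explain early returns in one paragraph." }] }],
      }),
    });
    const data = await res.json();
    console.log(data.candidates?.[0]?.content?.parts?.[0]?.text);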
TiredOfLife
Codeium's autocomplete is free.
tokai
Free like Google Search is free, I presume?
skerit
As usual for Google services: where other providers just make you sign in and you're set, Google requires you to create a specific "Cloud project" and then makes you look through the menus to specifically enable the "Gemini Code Assist" feature.
infecto
Look no further than the two different SDKs and APIs to access all the LLM tools.
hahn-kev
Not for the free tier. I set it up today and didn't have to do anything.
mrtksn
I don't believe that this is the future of computer programming; all these coding assistants feel like companies trying to make mechanical horses and caravans when the combustion engine was invented.
IMHO they should be inventing cars, planes, and trains.
Why? Because they write code using tools made to accommodate people, and when you take people out of the loop, keeping those tools is pointless. It's especially evident when these AI tools import a bazillion libraries that exist only to save humans from reinventing solved problems and to provide comfort while coding.
The AI programming tools are not like humans; they shouldn't be using tools made for humans, and should instead solve the tasks directly. I.e., if you are making a web UI, the AI tool doesn't have to import tons of libraries to center a div and make it pretty. It should be able to directly write the code that centers it, and humans probably shouldn't be looking at the code at all.
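(To be concrete about the centering example: without any library it's a handful of direct lines anyway. A sketch in TypeScript against the DOM, with the element id invented:)

    // Center a child element with a few direct style assignments, no library.
    const box = document.getElementById("box")!; // id invented for the example
    box.style.display = "flex";
    box.style.justifyContent = "center";
    box.style.alignItems = "center";
    box.style.minHeight = "100vh";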
dmalik
When is the last time you tried Cursor? It definitely doesn't import libraries on its own. It can if you prompt it to, but you have control.
I find it works great if you prompt it one step at a time. It can still be iterative, but it allows you to tighten up the code as you go.
Yes, you can still go yolo mode and get some interesting prototypes if you just need to show someone something fast, but if you know what you're doing, it just saves time.
mrtksn
It has been some time since my last try, but importing libraries isn't my core concern.
I still feel more comfortable with the chat interface, where I talk to the LLM and make it generate code that I end up putting together in a dumb editor, because I'm still writing code for human analysis that will then be interpreted by a machine. My claim is that if the code is actually to be written by a machine for the consumption of a machine, then the human should be out of the code-creation loop completely, and should fully assume the role of someone who demands stuff, knows when it's done right, and doesn't bother with the code itself.
darkerside
That would be the case if this were actually AI. But LLMs actually do procedurally generate language in a fairly human way, so using human languages makes sense to me.
mrtksn
The problem with this is that we can't get rid of the baggage of higher and higher levels of libraries and languages that exist to accommodate humans.
I agree that it makes sense to use these currently, but IMHO the ultimate form of programming will be free of human-readable code; instead, the AI will create a representation of the algorithm we need and execute around it.
Like having an idea: for example, if you need to program a robot vacuum cleaner, you should be able to describe how you want it to behave, and the AI will create an algorithm (an idea, like "let's turn when we bump into a wall, then try again") and constantly tweak and tend it. We wouldn't be directly looking at the code the AI wrote; instead, we could test it and find edge cases that a machine maybe wouldn't predict (e.g. the cat sits on the robot and blocks the sensors). The sketch below shows roughly what such a tweakable idea could look like.
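A toy TypeScript version of that bump-and-turn idea, against an imaginary robot interface (every name here is invented for illustration):

    // An imaginary robot interface; nothing here is a real API.
    interface Robot {
      bumped(): boolean;
      forward(): void;
      rotate(degrees: number): void;
    }

    // One tick of the "turn when you bump into a wall, then try again" idea.
    // The rotation range is the kind of knob the AI would keep tweaking.
    function step(robot: Robot): void {
      if (robot.bumped()) {
        robot.rotate(90 + Math.random() * 180);
      }
      robot.forward();
    }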
croes
Timeō Danaōs et dōna ferentēs. ("I fear the Greeks, even bearing gifts.")
MonkeyClub
Well said.
In particular, "Data excluded from training by default" is not available in the free tier or the first paid tier.
Google was obviously irked that Microsoft got all this juicy training data, since everyone is in their walled git garden, and tried to come up with a way to also dip some fingers into said data.
You should see the reviews in the JetBrains plugin page: https://plugins.jetbrains.com/plugin/24198-gemini-code-assis...
People are all so "shut up and take my money", but the bloody thing won't even sign them in.
But it's still in beta, right? Perhaps it'll start working in a couple more months.
barotalomey
No thanks!
JoshTriplett
"Free as in puppy"
If you're a computer science student who is thinking of using this, don't.
Don't use Copilot, Gemini, Cursor, or any other code-assisting tool for the first several years of your study or career. You will write code slower than others, sure, but what you learn and what you build will be a hundred times more useful to you than if you just copy and paste "stuff" from AI.
Invest in fundamentals, learn best practices, be curious.