Google gifts a Free AI Coding Assistant to the developer community
70 comments
February 27, 2025 · _mitterpach
ndr
This is bad advice for people with the right mindset.
If you want to learn: don't use these models to do the things for you. Do use these models to learn.
LLMs might not be the best teachers in the world, but they're available right there and then when you have a question. Don't take their answers for granted; test them. Ask them to teach you the correct terms for things you don't know yet, so you can ask better questions.
There has never been a better time to learn. You don't have to learn the same way your predecessors did. You can learn faster and better, but you must stay vigilant that you're not fooling yourself.
jonplackett
I feel bad for anyone learning to code now. The temptation to just LLM it must be sooooo high.
But at the same time, having personalised StackOverflow without the negative attitude at your fingertips is super helpful, provided you also do the work of learning.
gvurrdon
> personalised StackOverflow without the negative attitude
Phrased in that way, it does sound very tempting. Over the past few years it's become pretty much a waste of time to post on SO (well, in my experience, anyway).
soco
But wouldn't you learn if you actually had to enter and test that code, even if it's LLM-generated, every day? Maybe you learn bad patterns, which can happen from SO as well, but you do learn. I'm more worried that juniors won't have this opportunity anymore, or rather, that they won't be needed anymore. So what happens when I retire? Unless AI gets better and replaces everybody; then it won't matter at all what and how you learned.
ndr
The non-curious, non-skeptical people might have better luck learning by sticking with books and coding offline.
For the curious and skeptical, there's never been a better moment to pick anything up. I don't know how we make more people more curious and skeptical.
singularity2001
Don't feel bad: LLMs make it so much easier to learn things without getting stuck on frustrating nonsense, yet enough hurdles remain that you still need to develop resilience.
myaccountonhn
The errors and inefficiencies LLMs make are very subtle. You're also just stuck with whatever they were trained on. I echo OP: learn from documentation and code. This is as true now as it was back when Stack Overflow was used for everything.
fredgrott
I had this argument with Mark Cuban recently, and he admitted he was wrong; my example was using a calculator to learn calculus...
We are forgetting what the name "general purpose" means for learning and for real practical usage as a tool.
BTW, the best AI and computer science discussions are happening on Bluesky.
azemetre
Who are these people on Bluesky?
ndr
I can't tell what your respective positions were. In many ways:
LLMs : computer science = [opposite of a calculator] : calculus
A calculator will not let you discover terms to BFS/DFS knowledge on. It will not help you find the terms of art for fuzzy concepts you encountered and can barely describe.
It is not a learning accelerator.
worthless-trash
Having seen the code they make, don't learn from these models. Remember they are trained on a lot of code, not a lot of good code.
joseda-hg
They don't necessarily produce good code, but you can get them pretty far. One of the most interesting parts of LLMs (chat-based ones, I guess; I haven't tried the Copilot-style ones enough) is that smallish rewrites are really low-cost.
Don't like the way it did something convoluted, or that it didn't use early returns (see the sketch below)? Say so and it will fix it. Chain as many requests as you need; it won't get fed up with you. And if you see it losing detail because of memory limits, use what you learned from those requests to write a significantly more polished prompt in a new chat for a cleaner starting point.
Don't just accept what it gives you either.
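By "early returns" I mean the kind of one-line rewrite request that's cheap to ask for. A generic before/after sketch (hypothetical names, not from any actual session):

    interface User { active: boolean; isPremium: boolean; }

    // Before: nested conditionals
    function discountNested(user: User): number {
      if (user.active) {
        if (user.isPremium) {
          return 0.2;
        } else {
          return 0.1;
        }
      }
      return 0;
    }

    // After: early returns flatten the nesting
    function discount(user: User): number {
      if (!user.active) return 0;
      if (user.isPremium) return 0.2;
      return 0.1;
    }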
desdenova
Use documentation and real world code to learn, not slop.
keyle
It's interesting that you write this. I have long experience, and I use this autocomplete-on-drugs all the time now... I can't see myself writing all the damn code myself anymore.
I remember the days of using books, having to follow the code bits in the book as I typed them. I don't remember diddly squat about it. Same from years of stack overflow. I'd just alt-tab 12 times, read the comments, then read another answer, assess the best answer. Massive waste of time.
Use all the technology you have at your hands I say. But be sure to understand what you auto-completed. If not, stop and learn.
pdimitar
> But be sure to understand what you auto-completed. If not, stop and learn.
But that's IMO exactly what your parent commenter says. Use LLMs only after you actually have a clue what they are producing. So if you are a beginner, basically don't, because you won't have any understanding.
nchagnet
Agreed with you, although I'd say there is a spectrum between "do everything the hardest way" and "don't learn anything". I think LLM-based tools can be a great time-saver for boilerplate repetitive code, and they can sometimes help you get a first draft of some implementation you're not fully seeing yet, but you definitely should not rely on them to write the whole code for you if you want to learn.
infecto
Nope, the theme of the parent was...
>Don't use Copilot, Gemini, Cursor or any other code assisting tool for the several first years of your study or career.
I totally disagree with it too, and think it's no different from using a book or SO in the past. As a junior you copy-paste many lines of code that you don't fully appreciate, and sometimes it simply takes time doing that to absorb the knowledge.
Drakim
I agree, but I'll add that you can still use a standalone LLM window as a "teacher". If you don't know how to do something, ask it how to do it, and make it explain every piece of what's going on so you truly absorb it. But don't let it write the code FOR you, you should implement it yourself.
BLubliulu
I think this is not a good idea or suggestion at all.
If I use Google Maps to find my way around, I'm faster by a lot. I also remember things despite Google Maps doing the work for me.
Use code assistants as much as possible, but make sure to read and understand what you get, and try to write it yourself, even if you're just retyping it from a second window.
At this pace, how we write code will change substantially in the next few years anyway.
Rebuff5007
You (a SWE) using Google Maps is a bit different from a cab driver relying on Google Maps. The former is fine; the latter could be concerning.
I'd argue that coding assistants for a SWE hinders their ability to build subject matter expertise.
pinoy420
[dead]
icepat
I tend to agree, but I'll extend it beyond computer science students and say it applies especially to people who are self-learning. When I was getting started, I actively tried to minimize the number of packages and abstraction tools I used. Consequently, I was able to properly and deeply understand how things worked. Only once I was able to really explain and use a tool would I accept an automated solution for it.
On the flip side, I've now found that getting AI to kick the tires on something I'm not super well versed in, helps me figure out how it works. But that's only because I understand how other things work.
If you're going to use AI in your learning, I think the best way you can do that is ask it to give you an example, or an incomplete implementation. Then you can learn in bits while still getting things done.
prismatix
I just interviewed someone for a senior position who'd been using these AI copilots for 1.5 years as a contractor. In the interview I politely said I wanted to evaluate their skills without AI, so no Cursor/copilots allowed. They did not remember how to map over an array, define a function, add click/change handlers to an input, etc.
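For context, this is the level of basics I mean; a generic sketch, not the actual interview tasks:

    // Map over an array
    const doubled = [1, 2, 3].map((n) => n * 2);

    // Define a function
    function greet(name: string): string {
      return `Hello, ${name}`;
    }

    // Add click/change handlers to an input
    const input = document.querySelector<HTMLInputElement>("#name");
    input?.addEventListener("click", () => console.log("clicked"));
    input?.addEventListener("change", () => console.log(input?.value));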
sachinjain
Is this a good evaluation criterion anymore? Also, did you allow the candidate to use the internet?
Most of us do not remember the exact syntax for everything despite having coded in that language/framework for years.
lsaferite
It's an interesting space for discussion.
What I've found after developing software for many decades and learning many languages is that the concepts and core logical thinking are what is most important in most cases.
Before the current AI boom I would still have had a problem doing some tasks in a vacuum as well. Not because I was incapable, but because I had so much other relevant information in my head that the minutiae of some tasks were irrelevant when I had immediate access to the needed information via auto-complete in an IDE and language documentation. I knew what I needed to look up because of all that other knowledge in my head, though. I knew things were possible. And in cases where I didn't _know_ something was possible, I had an inkling that it might be, because I could do it in another language or it was a logical extension of some other concept.
With the current rage of AI coding copilots, I personally feel like many people are going down a path that degrades the corpus of general knowledge that drives the ability to solve problems quickly. Instead, they lean on the coding assistant to have that knowledge and simply direct it to do tasks at a macro level. On the surface this may seem like a universal boon, but the reality is they are giving up the intrinsic domain knowledge that is needed to understand what software is doing and how to solve the problems that will crop up.
If those two paragraphs seem contradictory in some manner, I agree. You can argue that leaning on IDE syntax autocomplete and looking up documentation is not foundationally different from leaning on a coding assistant. I can only say that they don't _feel_ the same to me. Maybe what I mean is: if the assistant is writing code and you are directly using it, then you never gain knowledge. If you are looking things up in documentation or using auto-complete for function names or arguments, you are learning about the code and how to solve a problem. So maybe it's just a question of what abstraction level we, as a profession, are comfortable with.
To close out this oddly long comment, I personally use LLMs and other ML models frequently. I have found that they are excellent at helping me formulate my thoughts on a problem that needs to be solved and to surface information across a lot of sources into a coherent understanding of an issue. Sure, it's possible that it's wrong, but I just use it to help steer me towards the real information I need. If I ask for or it provides code, that's used as a reference implementation for the actual implementation I write. And my IDE auto-complete has gotten a boost as well. It's much better at understanding the context of what I'm writing and providing guesses as to what I'm about to type. It's quite good. Most of the time. But it's also wrong in very subtle ways that require careful reading to notice. And I'll sum this paragraph up with the fact that I'm turning to an LLM more and more as a first search before I hit a search engine (yet I hate Google's AI search results).
prismatix
The situation opened up a very interesting discussion on our team. All of us on the team use AI tools in our job (you'd be a fool not to these days). I even use the copilot tool that the candidate used. But the difference is that I don't rely on it, and any code it produces I'm actively registering in my head. I would never let it write something that I don't understand without taking the time to understand it myself.
I do agree though. Why do IntelliSense and copilots feel so different from one another? I think part of it is that with IntelliSense you generally need to start the action before it auto-suggests, whereas with copilots you don't even need to initiate the action.
reducesuffering
Don't use garbage collection or high-level dynamic typing when building web servers. It's important to understand what the machine is actually doing at a low level. Implementing REST APIs in C++ means you'll write code slower than others, sure, but you'll gain so much in your fundamentals of how memory management and OS processes work.
roenxi
The more direct comparison might be "don't use compilers for the first few years; learn assembly directly instead". LLMs aren't going away; it doesn't make sense to learn how to do things LLMs already do now and are only going to get better at.
xnx
If you're a student who is thinking of not using LLMs, don't.
You put yourself at a significant disadvantage by not availing yourself of an infinitely patient, non-judgemental, and thoroughly well-read tutor/coach.
xnx
I thought it used to be courtesy for blogs to link to the source?
https://blog.google/technology/developers/gemini-code-assist...
smallerfish
Here's the terms of service for the plugin: https://developers.google.com/gemini-code-assist/resources/p...
This says nothing about Google's use of the data. Anybody have a better link?
graphGL
https://developers.google.com/gemini-code-assist/resources/p...
Data is included in their training by default. You'll need to opt out.
terminalbraid
Apropos of nothing: Annatar was a very good disguise for Sauron to use to gain the elves' trust while secretly having them forge something for himself.
WithinReason
Unrelatedly, Google uses the completions users accept to train its AI further.
pdimitar
Curious to try this as most other tools have fairly stingy free tiers where you basically have one good interaction and boom, now you have to pay.
(Zero analogies to drug dealers. Wink wink.)
So can this be used standalone, or must you use JetBrains / VSCode / Firebase / GitHub with no other option? I'm not seeing any.
FergusArgyll
I think it's only for those editors. Google does also give out a free API key (aistudio.google.com) for the underlying models (though not the coding fine-tuned one). But IMO the free tier is rate-limited a bit too much to build your own extension on top of it for free.
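For anyone curious, calling the underlying models with such a key looks roughly like this; a minimal sketch assuming the @google/generative-ai Node SDK (the model name and free-tier limits may differ):

    import { GoogleGenerativeAI } from "@google/generative-ai";

    async function main() {
      // GEMINI_API_KEY: a free key from aistudio.google.com
      const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY!);
      const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" });
      const result = await model.generateContent("Explain Big-O notation in two sentences.");
      console.log(result.response.text());
    }

    main();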
TiredOfLife
Codeium's autocomplete is also free.
tokai
Free like Google Search is free, I presume?
sachinjain
This weekend I tried out Gemini Code Assist and compared it to my usual AI assistant, the Cursor IDE. I shared my findings here [0].
[0]: https://sachinjain.substack.com/p/ai-coding-assistant-gemini...
skerit
As usual for Google services: where other providers just have you sign in and you're set, Google requires you to create a specific "Cloud project" and then makes you hunt through the menus to specifically enable the "Gemini Code Assist" feature.
hahn-kev
Not for the free tier. I set it up today and didn't have to do anything.
infecto
Look no further than the two different SDKs and APIs to access all the LLM tools.
mrtksn
I don't believe this is the future of computer programming; all these coding assistants feel like companies trying to make mechanical horses and caravans after the combustion engine was invented.
IMHO they should be inventing cars, planes, and trains.
Why? Because they write code using tools made to accommodate people, and when you take people out of the loop, keeping those tools is useless. It's especially evident when these AI tools import a bazillion libraries that exist only to save humans from reinventing and re-solving already-solved problems, and to provide comfort when coding.
The AI programming tools are not like humans; they shouldn't be using tools made for humans, and should instead solve the tasks directly. I.e., if you are making a web UI, the AI tool doesn't have to import tons of libraries to center a div and make it pretty. It should be able to directly write the code that centers it, and humans probably shouldn't be looking at the code at all.
dmalik
When did you last try Cursor? It definitely doesn't import libraries unprompted. It can if you ask it to, but you have control.
I find it works great if you prompt it one step at a time. It can still be iterative, but it allows you to tighten up the code as you go.
Yes, you can still go yolo mode and get some interesting prototypes if you just need to show someone something fast, but if you know what you're doing, it just saves time.
mrtksn
It has been some time since my last try, but importing libraries isn't my core concern.
I still feel more comfortable with the chat interface, where I talk to the LLM and have it generate code that I end up putting together in a dumb editor, because I'm still writing code for human-based analysis that will then be interpreted by a machine. My claim is that if the code is actually to be written by a machine for consumption by a machine, then the human should be out of the code-creation loop completely, fully assuming the role of someone who demands things, knows when they're done right, and doesn't bother with the code itself.
darkerside
That would be the case if this were actually AI. But LLMs actually do procedurally generate language in a fairly human way, so using human languages makes sense to me.
mrtksn
The problem with this is that we can't get rid of the baggage of higher and higher levels of libraries and languages that exist to accommodate humans.
I agree that it makes sense to use these currently, but IMHO the ultimate programming will be free of human-readable code; instead, the AI will create a representation of the algorithm we need and execute around it.
Like having an idea: for example, if you need to program a robot vacuum cleaner, you should be able to describe how you want it to behave, and the AI will create an algorithm (an idea, like "let's turn when we bump into a wall, then try again") and constantly tweak and tend it. We wouldn't directly look at the code the AI wrote; instead, we'd test it and find the edge cases a machine maybe wouldn't predict (e.g., the cat sits on the robot and blocks the sensors).
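A toy sketch of that bump-and-turn idea (the Robot interface here is hypothetical, purely for illustration):

    // Hypothetical robot API: one sensor, two actuators
    interface Robot {
      bumped(): boolean;           // did we just hit a wall?
      forward(): void;             // drive one step ahead
      turn(degrees: number): void;
    }

    // "Let's turn when we bump into a wall, then try again"
    function step(robot: Robot): void {
      if (robot.bumped()) {
        robot.turn(90 + Math.random() * 90); // randomized turn to avoid loops
      } else {
        robot.forward();
      }
    }

The point being: a human can judge the behavior (does it get stuck under the cat?) without ever reading this code.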
darkerside
What makes the AI context window immune to the same issues that plague us humans? I think they will still benefit from high level languages and libraries. AIs that can use them will be able to manage larger and more complex systems than the ones that only use low level languages.
croes
Timeō Danaōs et dōna ferentēs. ("I fear the Greeks, even bearing gifts.")
MonkeyClub
Well said.
Particularly, "Data excluded from training by default" is not available in the free and first paid tier.
Google was obviously irked that Microsoft got all this juicy training data since everyone is on their walled git garden, and tried to come up with a way to also dip some fingers into said data.
You should see the reviews in the JetBrains plugin page: https://plugins.jetbrains.com/plugin/24198-gemini-code-assis...
People are all so "shut up and take my money", but the bloody thing won't even sign them in.
But it's still in beta, right? Perhaps it'll start working in a couple more months.
ChrisArchitect
Previously on official post: https://news.ycombinator.com/item?id=43170626
JoshTriplett
"Free as in puppy"
If you're a computer science student who is thinking of using this, don't.
Don't use Copilot, Gemini, Cursor, or any other code-assisting tool for the first several years of your study or career. You will write code slower than others, sure, but what you'll learn and what you build will be a hundred times more useful to you than when you just copy and paste 'stuff' from AI.
Invest in fundamentals, learn best practices, be curious.