<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>Antigravity on Melabit</title>
    <link>https://static.233.196.69.159.clients.your-server.de/en/tags/antigravity/</link>
    <description>Recent content in Antigravity on Melabit</description>
    <generator>Hugo</generator>
    <language>en</language>
    <lastBuildDate>Mon, 16 Mar 2026 18:00:00 +0000</lastBuildDate>
    <atom:link href="https://static.233.196.69.159.clients.your-server.de/en/tags/antigravity/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>Antigravity: the LLM does it better</title>
      <link>https://static.233.196.69.159.clients.your-server.de/en/2026/03/16/antigravity-the-llm-does-it-better/</link>
      <pubDate>Mon, 16 Mar 2026 18:00:00 +0000</pubDate>
      <guid>https://static.233.196.69.159.clients.your-server.de/en/2026/03/16/antigravity-the-llm-does-it-better/</guid>
      <description>&lt;p&gt;Over the last few weeks, I’ve been writing about my experiments with Antigravity, or rather, with the (more or less) &lt;em&gt;intelligent agents&lt;/em&gt; integrated into the editor. The results have been mixed: sometimes &lt;a href=&#34;https://melabit.com/en/2026/01/21/an-unexpected-antigravity/&#34;&gt;the agents proved to be very effective&lt;/a&gt;, deftly handling some complex or repetitive tasks; in other cases &lt;a href=&#34;https://melabit.com/en/2026/02/16/antigravity-from-surprise-to-doubt/&#34;&gt;they didn&amp;rsquo;t accomplish anything worthwhile&lt;/a&gt;, only wasting a huge amount of time.&lt;/p&gt;&#xA;&lt;h4 id=&#34;coding-badly-with-an-llm&#34;&gt;Coding badly with an LLM&lt;/h4&gt;&#xA;&lt;p&gt;Among all the more or less useful applications of LLMs, one of the most controversial is the idea that anyone can use an LLM to program, even when they don&amp;rsquo;t know the first thing about it. This is the so-called &lt;em&gt;vibe coding&lt;/em&gt;, a concept that can be &lt;a href=&#34;https://x.com/karpathy/status/1886192184808149383&#34;&gt;summarized more or less like this&lt;/a&gt;: &amp;ldquo;I just see stuff, say stuff, run stuff, and copy paste stuff, and it mostly works&amp;rdquo;. 
This is not a quote from &lt;a href=&#34;https://www.youtube.com/watch?v=-w9EBTB8a3Y&#34;&gt;Nanni Moretti&lt;/a&gt;, a famous Italian actor and director; it was written by Andrej Karpathy, a founding member of OpenAI.&lt;/p&gt;&#xA;&lt;p&gt;Now, setting aside the many issues regarding security, the maintainability of code produced without really knowing what it does, and the &lt;a href=&#34;https://taylorandfrancis.com/knowledge/Engineering_and_technology/Engineering_support_and_special_topics/Regression_to_the_mean/&#34;&gt;regression to the mean&lt;/a&gt;, which here implies that the quality of AI-produced code will inevitably drift toward the average, it only takes a bit of personal experimentation, even with simple projects, to realize that the very idea of &lt;em&gt;coding for everyone&lt;/em&gt; doesn&amp;rsquo;t work. If you want to get something good out of AI, you have to know exactly what you want and how you want it.&lt;/p&gt;&#xA;&lt;p&gt;I tried &lt;em&gt;vibe coding&lt;/em&gt; myself: I asked Gemini to &lt;a href=&#34;https://melabit.com/en/2026/03/04/antigravity-a-driver-written-by-ai/&#34;&gt;write a driver for an Arducam camera&lt;/a&gt;, explaining only the bare essentials and waiting for it to do everything on its own.&lt;sup id=&#34;fnref:1&#34;&gt;&lt;a href=&#34;#fn:1&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;1&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&#xA;&lt;p&gt;However, this approach did not work. It took hours upon hours of work, errors, inconclusive answers, corrections, and advice from me to reach an acceptable result.&lt;/p&gt;&#xA;&lt;p&gt;And this was only possible because I eventually abandoned the idea of letting Gemini do everything and provided it with specific advice on how to proceed. 
Without that, it would have only been a massive waste of time.&lt;/p&gt;&#xA;&lt;div style = &#34;border-style: solid; border-width: 0px 0px 0px 12px; border-color: #03a87c; background-color: #defef6; padding: 1em; color:black; font-size:100%;&#34;&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://emeralit-muloblog.statichost.page/&#34;&gt;Mimmo&lt;/a&gt; also tried, on his own, to have &lt;a href=&#34;https://lumo.proton.me&#34;&gt;Lumo&lt;/a&gt; write a MicroPython program to manage my Arducam Mini 5MP Plus with the Pico. Even Lumo, which is undoubtedly one of the best chatbots around, couldn&amp;rsquo;t do it.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;br&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/Gemini_Generated_Image_jb08gsjb08gsjb08.jpg&#34; alt=&#34;&#34;&gt;&#xA;&amp;ndash; &lt;em&gt;Image generated by &lt;a href=&#34;https://gemini.google.com&#34;&gt;Google Gemini&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;h4 id=&#34;coding-better-with-the-same-llm&#34;&gt;Coding better with the same LLM&lt;/h4&gt;&#xA;&lt;p&gt;This doesn&amp;rsquo;t mean that an LLM can&amp;rsquo;t be useful when programming; you just have to use it properly. You must explain exactly what you want it to do, provide all the necessary materials, and set boundaries within which it must operate.&lt;/p&gt;&#xA;&lt;p&gt;In short, you have to treat it like a graduate student or an intern, with little experience but a great desire to learn.&lt;/p&gt;&#xA;&lt;p&gt;It only took two carefully written prompts to make a difference. 
The first defined the project&amp;rsquo;s directory structure and instructed the agent to use &lt;code&gt;git&lt;/code&gt; for version control and &lt;code&gt;uv&lt;/code&gt; to install any necessary Python packages,&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;    &#xA;    &lt;p&gt;Start a new project and name it &lt;code&gt;project13-pico-camera5mp-reprise&lt;/code&gt;.&lt;/p&gt;&#xA;    &lt;p&gt;Read the &lt;code&gt;AGENTS.md&lt;/code&gt; file that defines your basic behaviour.&lt;/p&gt;&#xA;    &lt;p&gt;Create a new directory with this name and initialize an empty &lt;code&gt;git&lt;/code&gt; repository using &lt;code&gt;git init&lt;/code&gt;&#xA;    Use &lt;code&gt;uv&lt;/code&gt; for managing Python packages.&#xA;    All the code must stay in the &lt;code&gt;code&lt;/code&gt; directory, with no subdirectories within it.&#xA;    This project does not need a &lt;code&gt;data&lt;/code&gt; directory and the &lt;code&gt;output&lt;/code&gt; directory must be renamed as &lt;code&gt;images&lt;/code&gt;.&lt;/p&gt;&#xA;    &#xA;&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;while the second prompt contained everything I could think of to help it work.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;    &#xA;    &lt;p&gt;I have a Rasperry Pi Pico 2W connected to a Arducam 5MP Plus OV5642 Mini Module Camera Shield (&lt;a href=&#34;https://www.arducam.com/arducam-5mp-plus-spi-cam-arduino-ov5642.html%29&#34;&gt;https://www.arducam.com/arducam-5mp-plus-spi-cam-arduino-ov5642.html)&lt;/a&gt;.&lt;/p&gt;&#xA;    &lt;p&gt;I have connected the camera to the Pico using the standard layout found in the documentation:&lt;/p&gt;&#xA; 
   &lt;table&gt;&#xA;      &lt;thead&gt;&#xA;          &lt;tr&gt;&#xA;              &lt;th&gt;Camera&lt;/th&gt;&#xA;              &lt;th&gt;CS&lt;/th&gt;&#xA;              &lt;th&gt;MOSI&lt;/th&gt;&#xA;              &lt;th&gt;MISO&lt;/th&gt;&#xA;              &lt;th&gt;SCK&lt;/th&gt;&#xA;              &lt;th&gt;GND&lt;/th&gt;&#xA;              &lt;th&gt;VCC&lt;/th&gt;&#xA;              &lt;th&gt;SDA&lt;/th&gt;&#xA;              &lt;th&gt;SCL&lt;/th&gt;&#xA;          &lt;/tr&gt;&#xA;      &lt;/thead&gt;&#xA;      &lt;tbody&gt;&#xA;          &lt;tr&gt;&#xA;              &lt;td&gt;Pico&lt;/td&gt;&#xA;              &lt;td&gt;GP5&lt;/td&gt;&#xA;              &lt;td&gt;GP3&lt;/td&gt;&#xA;              &lt;td&gt;GP4&lt;/td&gt;&#xA;              &lt;td&gt;GP2&lt;/td&gt;&#xA;              &lt;td&gt;GND&lt;/td&gt;&#xA;              &lt;td&gt;3V3&lt;/td&gt;&#xA;              &lt;td&gt;GP8&lt;/td&gt;&#xA;              &lt;td&gt;GP9&lt;/td&gt;&#xA;          &lt;/tr&gt;&#xA;      &lt;/tbody&gt;&#xA;    &lt;/table&gt;&#xA;    &lt;p&gt;The main documentation for the camera can be found at these links:&lt;/p&gt;&#xA;    &lt;ul&gt;&#xA;    &lt;li&gt;&lt;a href=&#34;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Introduction/&#34;&gt;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Introduction/&lt;/a&gt;&lt;/li&gt;&#xA;    &lt;li&gt;&lt;a href=&#34;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Hardware/Arducam-Shield-Mini-5MP-Plus/&#34;&gt;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Hardware/Arducam-Shield-Mini-5MP-Plus/&lt;/a&gt;&lt;/li&gt;&#xA;    &lt;li&gt;&lt;a href=&#34;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Software/Quick-Start-Guide/&#34;&gt;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Software/Quick-Start-Guide/&lt;/a&gt;&lt;/li&gt;&#xA;    &lt;li&gt;&lt;a 
href=&#34;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Pico/Camera-Module/SPI-Camera/&#34;&gt;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Pico/Camera-Module/SPI-Camera/&lt;/a&gt;&lt;/li&gt;&#xA;    &lt;li&gt;&lt;a href=&#34;https://www.uctronics.com/download/Image_Sensor/OV5642_DS.pdf&#34;&gt;https://www.uctronics.com/download/Image_Sensor/OV5642_DS.pdf&lt;/a&gt;&lt;/li&gt;&#xA;    &lt;li&gt;&lt;a href=&#34;https://blog.arducam.com/downloads/shields/ArduCAM_Camera_Shield_Software_Application_Note.pdf&#34;&gt;https://blog.arducam.com/downloads/shields/ArduCAM_Camera_Shield_Software_Application_Note.pdf&lt;/a&gt;&lt;/li&gt;&#xA;    &lt;li&gt;&lt;a href=&#34;https://www.uctronics.com/download/Amazon/B0067-B0068-Pico.pdf&#34;&gt;https://www.uctronics.com/download/Amazon/B0067-B0068-Pico.pdf&lt;/a&gt;&lt;/li&gt;&#xA;    &lt;/ul&gt;&#xA;    &lt;p&gt;The code to drive this camera with the Pico is written in C (i.e., for Arduino) and CircuitPython, and can be found here: &lt;a href=&#34;https://github.com/ArduCAM/PICO_SPI_CAM&#34;&gt;https://github.com/ArduCAM/PICO_SPI_CAM&lt;/a&gt;. 
You can download all the files you need from this repository.&#xA;    In particular, the files for CircuitPython are in the &lt;code&gt;Python&lt;/code&gt; directory:&lt;/p&gt;&#xA;    &lt;ul&gt;&#xA;    &lt;li&gt;&lt;code&gt;Arducam.py&lt;/code&gt;, which contains the functions to drive the camera,&lt;/li&gt;&#xA;    &lt;li&gt;&lt;code&gt;OV5642_reg.py&lt;/code&gt; which, as far as I have understood, defines the resolutions  allowed  when taking pictures, and&lt;/li&gt;&#xA;    &lt;li&gt;&lt;code&gt;boot.py&lt;/code&gt; , but I don&amp;rsquo;t know what it does.&#xA;    In the same directory there is also the script &lt;code&gt;ArduCAM_Mini_5MP_Plus_VideoStreaming.py&lt;/code&gt; which sends the video captured by the camera to a Windows application and that is of no use here.&lt;/li&gt;&#xA;    &lt;/ul&gt;&#xA;    &lt;p&gt;Please write a basic CircuitPython script that uses the functions defined in &lt;code&gt;Arducam.py&lt;/code&gt; to test that the camera works and can take photos.&lt;/p&gt;&#xA;    &#xA;&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;These two better-crafted prompts allowed &lt;code&gt;Gemini 3 Flash&lt;/code&gt; to quickly generate a CircuitPython script that runs directly on the Pico and performs all the requested tasks.&lt;/p&gt;&#xA;&lt;p&gt;Actually, the very first version of the script didn&amp;rsquo;t work. 
But Gemini was able to modify just a single line in the &lt;code&gt;Arducam.py&lt;/code&gt; driver provided by the manufacturer to fix it (specifically, it removed a zero from the frequency value).&lt;sup id=&#34;fnref:2&#34;&gt;&lt;a href=&#34;#fn:2&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;2&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/one-line-diff.png&#34;&gt;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/one-line-diff.png&#34; alt=&#34;&#34;&gt;&lt;/a&gt;&#xA;&amp;ndash; &lt;em&gt;The line highlighted in blue shows the only change that Gemini made to the &lt;code&gt;Arducam.py&lt;/code&gt; file; the original file is on the left, and the Gemini-modified version is on the right.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;p&gt;All the other files provided by the manufacturer remained intact.&lt;/p&gt;&#xA;&lt;p&gt;As it turned out, the manufacturer&amp;rsquo;s driver worked perfectly with the Pico. Gemini simply learned how to use it to write the script that captures the image directly from the microcontroller (with a bit of patience, I could have done that myself).&lt;/p&gt;&#xA;&lt;h4 id=&#34;guide-the-llm-or-let-it-run-on-its-own&#34;&gt;Guide the LLM or let it run on its own?&lt;/h4&gt;&#xA;&lt;p&gt;The difference with respect to the &lt;a href=&#34;https://melabit.com/en/2026/03/04/antigravity-a-driver-written-by-ai/&#34;&gt;previous approach&lt;/a&gt; based on &lt;em&gt;vibe coding&lt;/em&gt; is striking.&lt;/p&gt;&#xA;&lt;p&gt;Back then, Gemini worked frantically for hours and created such a mess that I was forced to restart my Mac twice. 
Meanwhile, it had convinced itself (and me, too) that the &lt;code&gt;Arducam.py&lt;/code&gt; driver was incompatible with the Pico and thus rewrote it from scratch, discarding an essential library for communicating with the camera (the library marked by the red arrow),&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/diff-libraries.png&#34;&gt;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/diff-libraries.png&#34; alt=&#34;&#34;&gt;&lt;/a&gt;&#xA;&amp;ndash; &lt;em&gt;Changes to the first few lines of the &lt;code&gt;Arducam.py&lt;/code&gt; file; original file on the left, Gemini&amp;rsquo;s modified version is on the right. The arrow marks the library that was removed by Gemini.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;p&gt;and removing most of the functions defined in that file for no reason and without any real gain in efficiency or functionality (quite the opposite, actually!). The image below shows the difference in the length of the original code, on the left, compared to the modified version, on the right.&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/diff-file-length.png&#34;&gt;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/diff-file-length.png&#34; alt=&#34;&#34;&gt;&lt;/a&gt;&#xA;&amp;ndash; &lt;em&gt;Changes to the &lt;code&gt;Arducam.py&lt;/code&gt; file. The red rectangles highlight the difference in the number of lines between the original code (left) and Gemini&amp;rsquo;s modified version (right).&lt;/em&gt;&lt;/p&gt;&#xA;&lt;p&gt;Not satisfied with that, Gemini also completely altered the &lt;code&gt;OV5642_reg.py&lt;/code&gt; file containing the camera configuration data, claiming  memory issues. 
These problems evidently didn&amp;rsquo;t exist, given that everything went smoothly in this second experiment using the manufacturer&amp;rsquo;s original file.&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/diff-registers.png&#34;&gt;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/diff-registers.png&#34; alt=&#34;&#34;&gt;&lt;/a&gt;&#xA;&amp;ndash; &lt;em&gt;Changes to the &lt;code&gt;OV5642_reg.py&lt;/code&gt; file, containing the camera configuration data. The original configuration (left) was overridden by Gemini (right).&lt;/em&gt;&lt;/p&gt;&#xA;&lt;h4 id=&#34;what-about-arduino&#34;&gt;What about Arduino?&lt;/h4&gt;&#xA;&lt;p&gt;At this point, it was inevitable that I would ask Gemini to generate the Arduino driver using the same information that I had already provided for CircuitPython.&lt;/p&gt;&#xA;&lt;p&gt;Gemini struggled more this time, mostly because I had forced it to make only minimal changes to the original Arducam libraries. However, this constraint was impossible to satisfy since the libraries do not natively support the Raspberry Pi Pico.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;answer&#34; open&gt;&#xA;    &lt;summary class=&#34;answer&#34;&gt;&#xA;        Response&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;answer&#34;&gt;&#xA;        &#xA;    &#xA;    &lt;strong&gt;Researching Arducam Library Compatibility&lt;/strong&gt;&#xA;    I&amp;rsquo;ve thoroughly inspected both the original &lt;code&gt;ArduCAM&lt;/code&gt; library and the &lt;code&gt;Arducam_mini&lt;/code&gt; library. Neither version contains native support for the RP2040/RP2350 (Pico) architecture. 
Specifically, both libraries are missing the necessary definitions for &lt;code&gt;regtype&lt;/code&gt;, &lt;code&gt;regsize&lt;/code&gt;, &lt;code&gt;cbi&lt;/code&gt;, and &lt;code&gt;sbi&lt;/code&gt; for this platform, which is the root cause of the compilation errors.&#xA;&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;Once the constraint was relaxed, things went smoothly, and in a short time, the driver modified specifically for the Pico and the related image acquisition program were ready.&lt;/p&gt;&#xA;&lt;h4 id=&#34;free-or-paid&#34;&gt;Free or paid?&lt;/h4&gt;&#xA;&lt;p&gt;Many argue that &lt;a href=&#34;https://www.oneusefulthing.org/p/a-guide-to-which-ai-to-use-in-the&#34;&gt;paid models are much better than free ones&lt;/a&gt;. Consequently, my &lt;em&gt;vibe coding&lt;/em&gt; problems would supposedly have disappeared if I had paid for Gemini or Claude.&lt;/p&gt;&#xA;&lt;p&gt;That&amp;rsquo;s not exactly how it went. During my testing, I mainly used the free version of Gemini to see what would happen to a non-professional user playing with these tools. However, when I ran out of tokens, I switched to the paid version of Gemini provided by the university, but I never noticed any significant differences. I can&amp;rsquo;t say much about Claude, because paying 200 dollars a month to use it seriously makes no sense for someone like me, who isn&amp;rsquo;t a professional programmer.&lt;/p&gt;&#xA;&lt;p&gt;What I did notice was that Claude Opus and Claude Sonnet were much better than Gemini at solving the problems that arose during code development. 
However, since I only used them once the work was already in progress, they might have just been helped by what Gemini had already done.&lt;/p&gt;&#xA;&lt;h4 id=&#34;conclusions&#34;&gt;Conclusions&lt;/h4&gt;&#xA;&lt;p&gt;It goes without saying that neither this experiment nor &lt;a href=&#34;https://melabit.com/en/2026/03/04/antigravity-a-driver-written-by-ai/&#34;&gt;the previous one&lt;/a&gt; is &lt;em&gt;scientific&lt;/em&gt;. For them to be considered as such, I would need to use different types of microcontrollers and modules, settle on a well-defined set of prompts, and measure how the agent responds to changing experimental conditions.&lt;/p&gt;&#xA;&lt;p&gt;Despite this, the experiment has one merit: it is fully &lt;em&gt;reproducible&lt;/em&gt;. Anyone can repeat it using the same low-cost hardware and prompts to verify whether my findings are correct.&lt;sup id=&#34;fnref:3&#34;&gt;&lt;a href=&#34;#fn:3&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;3&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&#xA;&lt;p&gt;This is something that the &lt;em&gt;vibe coding&lt;/em&gt; apostles with tens of millions of views always carefully avoid proposing. Their posts are full of certainties and bold predictions, yet they never provide any evidence to support their claims.&lt;/p&gt;&#xA;&lt;p&gt;Today, no one can deny that LLMs can be a great help during programming. However, it’s hard to believe that they can truly do so without any human intervention. 
At least until those apostles decide to provide us with clear, repeatable data to support their claims.&lt;/p&gt;&#xA;&lt;div class=&#34;footnotes&#34; role=&#34;doc-endnotes&#34;&gt;&#xA;&lt;hr&gt;&#xA;&lt;ol&gt;&#xA;&lt;li id=&#34;fn:1&#34;&gt;&#xA;&lt;p&gt;Although I know a little bit about programming, I had no idea how to develop a driver and didn&amp;rsquo;t want to spend days learning how to do it.&amp;#160;&lt;a href=&#34;#fnref:1&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:2&#34;&gt;&#xA;&lt;p&gt;Actually, this file is a library of functions. However, since microcontrollers don&amp;rsquo;t have an operating system, these functions access the hardware directly, just like the drivers that control the components of a computer. Therefore, in this case, the terms &lt;em&gt;library&lt;/em&gt; and &lt;em&gt;driver&lt;/em&gt; are interchangeable.&amp;#160;&lt;a href=&#34;#fnref:2&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:3&#34;&gt;&#xA;&lt;p&gt;The term &lt;em&gt;low cost&lt;/em&gt; must now be taken with a grain of salt. The shortage of electronic components has caused the price of the Pico to skyrocket, and even more so for the Arducam Mini 5MP Plus. The latter, which cost 40 dollars a few months ago, now costs more than triple that on Amazon or AliExpress. Fortunately, some retailers are maintaining more reasonable prices, at least for now.&amp;#160;&lt;a href=&#34;#fnref:3&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;/div&gt;&#xA;</description>
    </item>
    <item>
      <title>The Dory effect</title>
      <link>https://static.233.196.69.159.clients.your-server.de/en/2026/03/09/the-dory-effect/</link>
      <pubDate>Mon, 09 Mar 2026 18:00:00 +0000</pubDate>
      <guid>https://static.233.196.69.159.clients.your-server.de/en/2026/03/09/the-dory-effect/</guid>
      <description>&lt;p&gt;I am well aware that LLMs have &lt;a href=&#34;https://melabit.com/en/2026/02/16/antigravity-from-surprise-to-doubt/&#34;&gt;poor memory&lt;/a&gt;, but I never imagined that I would suffer the consequences so quickly.&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-03-09-effetto-dory/Gemini_Generated_Image_q8d3k9q8d3k9q8d3.jpg&#34; alt=&#34;&#34;&gt;&#xA;&amp;ndash; &lt;em&gt;Image generated by &lt;a href=&#34;https://gemini.google.com&#34;&gt;Google Gemini&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;p&gt;&lt;strong&gt;Note to the reader.&lt;/strong&gt; This article complements the previous one, &lt;a href=&#34;https://melabit.com/en/2026/03/04/antigravity-a-driver-written-by-ai/&#34;&gt;Antigravity: a driver written by AI&lt;/a&gt;, and should be read afterward. However, here&amp;rsquo;s a brief recap for lazy readers.&lt;/p&gt;&#xA;&lt;p&gt;&lt;em&gt;Not knowing how to write a driver for an &lt;a href=&#34;https://en.wikipedia.org/wiki/Serial_Peripheral_Interface&#34;&gt;SPI camera&lt;/a&gt; attached to a Raspberry Pi Pico, I asked Gemini 3 Flash for help, working entirely inside Antigravity. After many hours, the agent managed to produce a working Arduino program. 
Then I made a fatal mistake by asking for even more without taking any precautions.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;h4 id=&#34;premise&#34;&gt;Premise&lt;/h4&gt;&#xA;&lt;p&gt;The &lt;a href=&#34;https://melabit.com/en/2026/03/04/antigravity-a-driver-written-by-ai/#ai-at-work&#34;&gt;working code&lt;/a&gt; generated with such great effort by &lt;code&gt;Gemini 3 Flash&lt;/code&gt; assumed the Pico acted as a bridge between my Mac Mini and the SPI camera.&lt;sup id=&#34;fnref:1&#34;&gt;&lt;a href=&#34;#fn:1&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;1&lt;/a&gt;&lt;/sup&gt; On the Mac, a Python script commanded image capture, while the actual acquisition was performed by the Pico, running an Arduino program (a &lt;em&gt;sketch&lt;/em&gt; in platform jargon) that is essentially C/C++ code adapted for microcontrollers.&lt;/p&gt;&#xA;&lt;p&gt;Controlling everything from a PC works fine for early tests, but it becomes cumbersome when you want to use such a setup for fieldwork.&lt;/p&gt;&#xA;&lt;h4 id=&#34;automatic-captures&#34;&gt;Automatic captures&lt;/h4&gt;&#xA;&lt;p&gt;I asked Gemini to modify the code so it would automatically snap an image every N minutes or when a button is pressed, then send the picture to a computer over Wi‑Fi.&lt;sup id=&#34;fnref:2&#34;&gt;&lt;a href=&#34;#fn:2&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;2&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;[...] However, this means that I need to use a Pico for the capture and another computer, even a Raspberry Pi Zero for commanding the capture process. 
Would it be possible to do everything on the Pico, eg press a button and perform the capture or via a timed process (defined in the .ino file itself) every N minutes (but there should be a mechanism that sends the images to a computer connected by WiFi, avoiding to fill the pico storage space).&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;Gemini jumped straight into the work, still using Arduino. In principle, it should have been a trivial task, but no matter how hard Gemini tried, the code never worked. After many fruitless attempts, I got fed up and asked it to revert to the last known‑good version of the program.&lt;/p&gt;&#xA;&lt;h4 id=&#34;backtracking&#34;&gt;Backtracking&lt;/h4&gt;&#xA;&lt;p&gt;And here Gemini hit an unexpected wall: despite my guidance and constant suggestions, there was absolutely no way to succeed.&lt;/p&gt;&#xA;&lt;p&gt;Stupidly, I hadn&amp;rsquo;t kept track of the various code versions as they were generated, and that&amp;rsquo;s entirely my fault.&lt;/p&gt;&#xA;&lt;p&gt;On the other hand, even though I am well aware that large language models (LLMs) have a &lt;strong&gt;working memory&lt;/strong&gt; that resembles that of &lt;a href=&#34;https://youtu.be/3JNLwlcPBPI?si=lBu7revGCsYQ-nJY&amp;amp;t=62&#34;&gt;Dory, the blue tang fish&lt;/a&gt;, I would have never imagined that these ultra-sophisticated systems couldn&amp;rsquo;t even &lt;strong&gt;retrospectively reconstruct&lt;/strong&gt; what they&amp;rsquo;ve already done, despite having dozens upon dozens of &lt;code&gt;implementation_plan.md&lt;/code&gt;, &lt;code&gt;task.md&lt;/code&gt;, and &lt;code&gt;walkthrough.md&lt;/code&gt; files that detail every previous step. 
I counted them: for this project alone, Antigravity produced over 250 distinct files.&lt;/p&gt;&#xA;&lt;p&gt;Going back to a prior state is something that tools like &lt;code&gt;Git&lt;/code&gt; and its predecessors &amp;ndash; tools that are entirely devoid of any &lt;em&gt;intelligence&lt;/em&gt; beyond that of their creators &amp;ndash; have been doing flawlessly for &lt;a href=&#34;https://blog.tarynmcmillan.com/a-history-of-version-control&#34;&gt;more than fifty years&lt;/a&gt;. Maybe developers of these &lt;em&gt;intelligent&lt;/em&gt; agents could give it a thought, huh?&lt;/p&gt;&#xA;&lt;h4 id=&#34;epilogue&#34;&gt;Epilogue&lt;/h4&gt;&#xA;&lt;p&gt;In the end, stubbornly relying on my notes,&lt;sup id=&#34;fnref:3&#34;&gt;&lt;a href=&#34;#fn:3&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;3&lt;/a&gt;&lt;/sup&gt; I managed to get Gemini to repeat everything it had already done. It took hours and it reproduced all the earlier mistakes, but eventually the rebuilt code worked again.&lt;/p&gt;&#xA;&lt;p&gt;While I was at it, I also had Gemini redo the whole thing for CircuitPython, and that port required many more hours of work and errors. 
And before getting there, Gemini caused such severe issues that macOS had to disable USB communication with the Pico, forcing me to restart the Mac a couple of times.&lt;/p&gt;&#xA;&lt;p&gt;Needless to say, this time I had it keep track of every step with &lt;code&gt;git&lt;/code&gt; and I also put all the code on &lt;a href=&#34;https://github.com/sabinomaggi/pico-camera5mp&#34;&gt;GitHub&lt;/a&gt;, just in case.&lt;/p&gt;&#xA;&lt;p&gt;One of the next steps will (hopefully) be MicroPython, but that&amp;rsquo;s another story.&lt;/p&gt;&#xA;&lt;div class=&#34;footnotes&#34; role=&#34;doc-endnotes&#34;&gt;&#xA;&lt;hr&gt;&#xA;&lt;ol&gt;&#xA;&lt;li id=&#34;fn:1&#34;&gt;&#xA;&lt;p&gt;Why does Arduino struggle so much, despite being a far more mature microcontroller development platform than MicroPython or CircuitPython? Because the camera module I chose isn&amp;rsquo;t supported on the Raspberry Pi Pico; it only works with &lt;em&gt;classic&lt;/em&gt; Raspberry Pis (those running a full OS, such as the Pi 4, 5, or Zero), as well as with the Arduino and ESP8266 families of microcontrollers.&amp;#160;&lt;a href=&#34;#fnref:1&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:2&#34;&gt;&#xA;&lt;p&gt;English isn&amp;rsquo;t great here; it&amp;rsquo;s more a stream of thoughts than a polished discourse. But I prefer jotting down whatever comes to mind while writing rather than losing the thread of ideas.&amp;#160;&lt;a href=&#34;#fnref:2&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:3&#34;&gt;&#xA;&lt;p&gt;I&amp;rsquo;ll never stress enough how important it is to keep written notes of what you&amp;rsquo;re doing. 
At the beginning it seems like a waste of time, yet it&amp;rsquo;s the best way to avoid losing a huge amount of time later.&amp;#160;&lt;a href=&#34;#fnref:3&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;/div&gt;&#xA;</description>
    </item>
    <item>
      <title>Antigravity: a driver written by AI</title>
      <link>https://static.233.196.69.159.clients.your-server.de/en/2026/03/04/antigravity-a-driver-written-by-ai/</link>
      <pubDate>Wed, 04 Mar 2026 18:00:00 +0000</pubDate>
      <guid>https://static.233.196.69.159.clients.your-server.de/en/2026/03/04/antigravity-a-driver-written-by-ai/</guid>
      <description>&lt;p&gt;Among all the Raspberry Pi and Arduino boards I am spending my days with, my favorite is the &lt;a href=&#34;https://www.raspberrypi.com/products/raspberry-pi-pico/&#34;&gt;Raspberry Pi Pico&lt;/a&gt;, a small yet powerful &lt;a href=&#34;https://en.wikipedia.org/wiki/Microcontroller&#34;&gt;microcontroller&lt;/a&gt; that can be programmed not only in C/C++ via the Arduino IDE, but also in &lt;a href=&#34;https://www.raspberrypi.com/documentation/microcontrollers/micropython.html&#34;&gt;MicroPython&lt;/a&gt; and &lt;a href=&#34;https://learn.adafruit.com/getting-started-with-raspberry-pi-pico-circuitpython/&#34;&gt;CircuitPython&lt;/a&gt;, two competing Python variants for microcontrollers.&lt;/p&gt;&#xA;&lt;p&gt;Unlike the other Raspberry Pi models, the Pico does not have a &lt;a href=&#34;https://en.wikipedia.org/wiki/Camera_Serial_Interface&#34;&gt;dedicated camera interface&lt;/a&gt;, but it can use cameras that communicate over an &lt;a href=&#34;https://en.wikipedia.org/wiki/Serial_Peripheral_Interface&#34;&gt;SPI interface&lt;/a&gt;,&lt;sup id=&#34;fnref:1&#34;&gt;&lt;a href=&#34;#fn:1&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;1&lt;/a&gt;&lt;/sup&gt; such as the &lt;a href=&#34;https://www.arducam.com/arducam-5mp-plus-spi-cam-arduino-ov5642.html&#34;&gt;Arducam Mini 5MP Plus&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;In theory, this shouldn&amp;rsquo;t be a hard job. 
After glancing at the &lt;a href=&#34;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Pico/Camera-Module/SPI-Camera/#1introduction&#34;&gt;official documentation&lt;/a&gt;, it should just be a matter of wiring the camera to the Pico, pulling the driver and test programs from the manufacturer&amp;rsquo;s &lt;a href=&#34;https://github.com/ArduCAM/PICO_SPI_CAM&#34;&gt;GitHub repository&lt;/a&gt;, copying them onto the Pico after installing the supported Python version, and finally trying to snap a few pictures.&lt;/p&gt;&#xA;&lt;p&gt;I couldn&amp;rsquo;t have been more wrong.&lt;/p&gt;&#xA;&lt;h4 id=&#34;we-need-a-driver&#34;&gt;We need a driver&lt;/h4&gt;&#xA;&lt;p&gt;The only two test programs available, one for Arduino and one for CircuitPython, assume you&amp;rsquo;ll view the video output on Windows, which is odd because most people working with these devices use Linux or macOS. Moreover, CircuitPython is less common than MicroPython, and I&amp;rsquo;d have preferred the latter for compatibility with the other components already connected to the Pico.&lt;/p&gt;&#xA;&lt;p&gt;With patience, and armed with a reference program that shows the main functions for handling the device, you could try translating the CircuitPython code to MicroPython (or vice‑versa),&lt;sup id=&#34;fnref:2&#34;&gt;&lt;a href=&#34;#fn:2&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;2&lt;/a&gt;&lt;/sup&gt; but there was nothing like that available.&lt;/p&gt;&#xA;&lt;p&gt;Other guides I found online looked promising, yet they didn&amp;rsquo;t work in practice. The only option left was to bypass the official tools and write a driver for the camera.&lt;/p&gt;&#xA;&lt;p&gt;Easier said than done!&lt;/p&gt;&#xA;&lt;p&gt;I know how to write code to solve differential equations or analyse measurement data. I&amp;rsquo;m comfortable with micro‑controller programming and sensor data acquisition. But this is a completely different beast. 
It requires digging into the low‑level communication between the Pico and the camera, a job for specialist engineers who write the drivers that let us &lt;em&gt;humans&lt;/em&gt; perform complex operations (like taking a photo) with just a few lines of Python code.&lt;/p&gt;&#xA;&lt;p&gt;So why not let one of the agents built into Antigravity do what I&amp;rsquo;m not able to do?&lt;/p&gt;&#xA;&lt;div style = &#34;border-style: solid; border-width: 0px 0px 0px 12px; border-color: #03a87c; background-color: #defef6; padding: 1em; color:black; font-size:100%;&#34;&gt;&#xA;&lt;p&gt;I didn&amp;rsquo;t have to stick with Antigravity: Claude Code, Google Gemini, DeepSeek, or any of the countless other LLMs at our disposal could have easily handled the task. However, as I&amp;rsquo;ve &lt;a href=&#34;https://melabit.com/en/2026/01/21/an-unexpected-antigravity/&#34;&gt;already written&lt;/a&gt;, using an Antigravity-integrated agent lets you work directly in the editor and access the underlying OS, a huge advantage when you need to keep iterating on the generated code.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;br&gt;&#xA;&lt;h4 id=&#34;ai-at-work&#34;&gt;AI at work&lt;/h4&gt;&#xA;&lt;p&gt;When you read countless enthusiastic posts about &lt;em&gt;vibe coding&lt;/em&gt; &amp;ndash; the notion that AI can develop software with little to no human programmer intervention &amp;ndash; it looks very easy (read, for example, this &lt;a href=&#34;https://shumer.dev/something-big-is-happening&#34;&gt;post&lt;/a&gt; that amassed millions of reads, or &lt;a href=&#34;https://www.oneusefulthing.org/p/claude-code-and-what-comes-next&#34;&gt;this one&lt;/a&gt;; not to mention the tweet that &lt;a href=&#34;https://x.com/karpathy/status/1886192184808149383&#34;&gt;started it all&lt;/a&gt;).&lt;sup id=&#34;fnref:3&#34;&gt;&lt;a href=&#34;#fn:3&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;3&lt;/a&gt;&lt;/sup&gt; In summary, you write a couple of sentences asking the LLM or the 
current &lt;em&gt;agent&lt;/em&gt; to do something, you go have lunch or take a walk, and when you come back everything is ready.&lt;sup id=&#34;fnref:4&#34;&gt;&lt;a href=&#34;#fn:4&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;4&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&#xA;&lt;p&gt;Let&amp;rsquo;s try doing the same with &lt;code&gt;Gemini&lt;/code&gt;, giving it only the essential information and checking what it can actually do. In this case, this information boils down to the link to the camera&amp;rsquo;s web page and a table describing how the camera pins are connected to the Pico (it&amp;rsquo;s more than three lines, I know, but I&amp;rsquo;m a bit verbose).&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;    &#xA;    &lt;p&gt;I have a Raspberry Pi Pico 2W connected to an Arducam 5MP Plus OV5642 Mini Module Camera Shield (&lt;a href=&#34;https://www.arducam.com/arducam-5mp-plus-spi-cam-arduino-ov5642.html&#34;&gt;https://www.arducam.com/arducam-5mp-plus-spi-cam-arduino-ov5642.html&lt;/a&gt;). 
I have connected the camera to the Pico using the standard layout found in the documentation:&lt;/p&gt;&#xA;    &lt;table&gt;&#xA;    &lt;tbody&gt;&#xA;      &lt;tr&gt;&#xA;        &lt;th&gt;Camera&lt;/th&gt;&#xA;        &lt;th&gt;CS&lt;/th&gt;&#xA;        &lt;th&gt;MOSI&lt;/th&gt;&#xA;        &lt;th&gt;MISO&lt;/th&gt;&#xA;        &lt;th&gt;SCK&lt;/th&gt;&#xA;        &lt;th&gt;GND&lt;/th&gt;&#xA;        &lt;th&gt;VCC&lt;/th&gt;&#xA;        &lt;th&gt;SDA&lt;/th&gt;&#xA;        &lt;th&gt;SCL&lt;/th&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;        &lt;td&gt;Pico&lt;/td&gt;&#xA;        &lt;td&gt;GP5&lt;/td&gt;&#xA;        &lt;td&gt;GP3&lt;/td&gt;&#xA;        &lt;td&gt;GP4&lt;/td&gt;&#xA;        &lt;td&gt;GP2&lt;/td&gt;&#xA;        &lt;td&gt;GND&lt;/td&gt;&#xA;        &lt;td&gt;3V3&lt;/td&gt;&#xA;        &lt;td&gt;GP8&lt;/td&gt;&#xA;        &lt;td&gt;GP9&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;    &lt;/tbody&gt;&#xA;    &lt;/table&gt;&#xA;    &lt;p&gt;Please write a basic Micropython code to test that everything is right and that the camera can take photos&lt;/p&gt;&#xA;    &#xA;&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;Gemini attacks the problem furiously, spits out MicroPython code that doesn&amp;rsquo;t run, tries to fix it based on my hints, and inevitably introduces new errors (I keep saying &lt;code&gt;Gemini&lt;/code&gt; for brevity, but I&amp;rsquo;ve also tried &lt;code&gt;Gemini 3.1 Pro&lt;/code&gt;, &lt;code&gt;Claude Opus 4.6&lt;/code&gt;, and &lt;code&gt;Claude Sonnet 4.6&lt;/code&gt;, as my tokens ran out).&lt;/p&gt;&#xA;&lt;p&gt;At some point, completely stumped, Gemini starts suspecting a hardware issue, such as a bad connection or a defective camera (but probing the individual pins quickly shows that everything is wired correctly).&lt;/p&gt;&#xA;&lt;p&gt;After hours and hours of work, and because it can&amp;rsquo;t produce anything decent with MicroPython, I ask it to switch to Arduino, a more mature software platform. 
More hours later, stubbornly, it succeeds, and the image below is the very first picture ever taken with the Arduino code generated by Gemini (the original image was upside-down; I simply rotated it 180°).&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-03-04-antigravity-un-driver-scritto-dall-ia/capture.jpeg&#34;&gt;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-03-04-antigravity-un-driver-scritto-dall-ia/capture_small.jpeg&#34; alt=&#34;&#34;&gt;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;Soon after, I made a serious mistake that forced me, or rather &lt;em&gt;it&lt;/em&gt;, to start from scratch. But that&amp;rsquo;s a story for another post.&lt;/p&gt;&#xA;&lt;h4 id=&#34;epilogue&#34;&gt;Epilogue&lt;/h4&gt;&#xA;&lt;p&gt;Once I had a working Arduino program, I asked Gemini to redo everything for CircuitPython. This port also took several hours of work, interspersed with a couple of forced restarts of the Mac, which, after a nasty CircuitPython bug, had disabled communication between its USB ports and the Pico.&lt;/p&gt;&#xA;&lt;p&gt;In theory I should repeat the cycle a third time for MicroPython. 
But the thought of wasting even more time watching Gemini work makes me shiver, so I shelved that idea, at least for now.&lt;/p&gt;&#xA;&lt;h4 id=&#34;conclusions&#34;&gt;Conclusions&lt;/h4&gt;&#xA;&lt;p&gt;I don&amp;rsquo;t know what tools are used by the authors of the countless articles proclaiming that anyone, even someone who doesn&amp;rsquo;t know a thing about programming, can have AI write a perfectly functioning program in no time.&lt;/p&gt;&#xA;&lt;p&gt;Even though I know something about programming, I&amp;rsquo;ve only succeeded with the simplest experiments described in the two &lt;a href=&#34;https://melabit.com/en/2026/01/21/an-unexpected-antigravity/&#34;&gt;previous&lt;/a&gt; &lt;a href=&#34;https://melabit.com/en/2026/02/16/antigravity-from-surprise-to-doubt/&#34;&gt;articles&lt;/a&gt;. And it certainly didn&amp;rsquo;t happen here. It is worth noting, however, that I&amp;rsquo;m in &lt;a href=&#34;https://vladimir.varank.in/notes/2026/02/freebsd-brcmfmac/&#34;&gt;good company&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-03-04-antigravity-un-driver-scritto-dall-ia/Gemini_Generated_Image_attiipattiipatti.jpg&#34; alt=&#34;&#34;&gt;&#xA;&amp;ndash; &lt;em&gt;Image generated by &lt;a href=&#34;https://gemini.google.com&#34;&gt;Google Gemini&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;p&gt;True, following the &lt;em&gt;vibe coding&lt;/em&gt; mantra to the letter, I didn&amp;rsquo;t write a single line of code. I even let Gemini execute terminal commands that I could type blindfolded. But it&amp;rsquo;s equally true that without my constant dialogue, without guiding the agent step-by-step, without correcting its countless, and often trivial, mistakes, Gemini would have gone nowhere.&lt;/p&gt;&#xA;&lt;p&gt;And it took many hours of uninterrupted work, too many hours in my humble opinion, to achieve something usable.&lt;/p&gt;&#xA;&lt;p&gt;Forget speedy, intuitive programming. 
Forget coding based on &lt;a href=&#34;https://www.youtube.com/watch?v=apBWI6xrbLY&#34;&gt;good vibrations&lt;/a&gt; from the 1960s. Forget programming &lt;em&gt;for dummies&lt;/em&gt;!&lt;sup id=&#34;fnref:5&#34;&gt;&lt;a href=&#34;#fn:5&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;5&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&#xA;&lt;p&gt;If you don&amp;rsquo;t roll up your sleeves and get your hands dirty with the code, even code produced by a more or less intelligent agent, you won&amp;rsquo;t get anywhere. I&amp;rsquo;d love to be proven wrong.&lt;/p&gt;&#xA;&lt;div class=&#34;footnotes&#34; role=&#34;doc-endnotes&#34;&gt;&#xA;&lt;hr&gt;&#xA;&lt;ol&gt;&#xA;&lt;li id=&#34;fn:1&#34;&gt;&#xA;&lt;p&gt;The SPI (Serial Peripheral Interface) is a fast and reliable serial communication protocol widely used to connect a microcontroller (&lt;em&gt;master&lt;/em&gt;) to one or more peripheral devices (&lt;em&gt;slave&lt;/em&gt;).&amp;#160;&lt;a href=&#34;#fnref:1&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:2&#34;&gt;&#xA;&lt;p&gt;While writing the article I found &lt;a href=&#34;https://www.youtube.com/watch?v=l254lxm78I4&#34;&gt;this video&lt;/a&gt; that explains how to mix MicroPython and CircuitPython code in the same script. 
If it works for more complex tasks than toggling an LED, it would be a real breakthrough.&amp;#160;&lt;a href=&#34;#fnref:2&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:3&#34;&gt;&#xA;&lt;p&gt;Is it a coincidence that they are always very long posts but never contain a shred of evidence, a prompt, a few lines of code, or links to support the many unverified (and unverifiable) claims?&amp;#160;&lt;a href=&#34;#fnref:3&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:4&#34;&gt;&#xA;&lt;p&gt;Being true nerds, no one ever says they&amp;rsquo;re going to have sex.&amp;#160;&lt;a href=&#34;#fnref:4&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:5&#34;&gt;&#xA;&lt;p&gt;From the title of the famous &lt;a href=&#34;https://en.wikipedia.org/wiki/For_Dummies&#34;&gt;For Dummies&lt;/a&gt; book series, covering everything from using DOS to playing chess.&amp;#160;&lt;a href=&#34;#fnref:5&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;/div&gt;&#xA;</description>
    </item>
    <item>
      <title>Antigravity: from surprise to doubt</title>
      <link>https://static.233.196.69.159.clients.your-server.de/en/2026/02/16/antigravity-from-surprise-to-doubt/</link>
      <pubDate>Mon, 16 Feb 2026 18:00:00 +0000</pubDate>
      <guid>https://static.233.196.69.159.clients.your-server.de/en/2026/02/16/antigravity-from-surprise-to-doubt/</guid>
      <description>&lt;p&gt;As effective as Antigravity may be, digging a little deeper reveals that the agent-based systems working inside it, while helpful and capable of answering &lt;a href=&#34;https://melabit.com/en/2026/01/21/an-unexpected-antigravity/&#34;&gt;many complex questions&lt;/a&gt;, are not exempt from the usual issues of the large language models (LLMs) we&amp;rsquo;ve been dealing with for the past three years.&lt;/p&gt;&#xA;&lt;p&gt;Let&amp;rsquo;s look at some practical examples.&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/Gemini_Generated_Image_vzsj3tvzsj3tvzsj.jpg&#34; alt=&#34;&#34;&gt;&#xA;&amp;ndash; &lt;em&gt;Image generated by &lt;a href=&#34;https://gemini.google.com&#34;&gt;Google Gemini&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;h4 id=&#34;example-1&#34;&gt;Example #1&lt;/h4&gt;&#xA;&lt;p&gt;After successfully extracting data from &lt;a href=&#34;https://melabit.com/en/2026/01/21/an-unexpected-antigravity/#task-1-extracting-data-from-images&#34;&gt;images in one of my articles&lt;/a&gt;, I immediately got to work trying to apply the same method to other figures similar to those analyzed the first time, such as &lt;a href=&#34;https://www.sciencedirect.com/science/article/pii/S0341816225007738&#34;&gt;this one&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/1-s2.0-S0341816225007738-gr4_lrg.jpg&#34; alt=&#34;&#34;&gt;&lt;p&gt;The figure is clearly more complex than my previous ones: the top part features a histogram showing monthly precipitation and snow depth measurements for two locations with unpronounceable names in Québec, where different measurements are represented with varying shades and colors. 
Below it, and partially overlapping the histogram, there is a line graph with the average monthly temperatures for the two locations, with the vertical scale positioned on the right side.&lt;/p&gt;&#xA;&lt;p&gt;Even though the figure is more complex, I never expected &lt;code&gt;Gemini 3 Flash&lt;/code&gt; to struggle so much. It tries, and tries again, but every time something is off. The histogram isn&amp;rsquo;t a huge problem, but there&amp;rsquo;s no way to make it understand what to do with the data in the line graph. This issue persists even when I explain exactly where it should look to determine the temperature values for each month.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;Hint: To evaluate temperatures for a given month you should take a point on the temperature plot that is between the two precipitation bars &#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;I am truly surprised by Gemini&amp;rsquo;s difficulty in repeating and extending the results of &lt;a href=&#34;https://melabit.com/en/2026/01/21/an-unexpected-antigravity/#task-1-extracting-data-from-images&#34;&gt;my first test&lt;/a&gt;, even as it continues to reassure me, claiming it has &lt;em&gt;understood&lt;/em&gt; perfectly what I want and has performed a &lt;em&gt;meticulous&lt;/em&gt; job.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;answer&#34; open&gt;&#xA;    &lt;summary class=&#34;answer&#34;&gt;&#xA;        Answer&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;answer&#34;&gt;&#xA;        &#xA;I have completed a meticulous re-evaluation of Figure 4 using the &#34;midpoint&#34; method you suggested (reading the temperature at the horizontal center between the two monthly precipitation bars).&#xA;Final Corrections Applied: [...]&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;Actually, it seems 
that Gemini has no idea what the &amp;ldquo;midpoint&amp;rdquo; method is and is just repeating what I say, changing only a few words. If it were otherwise, it would have resolved the issue in seconds, pulling out the right data just as it did the first time.&lt;/p&gt;&#xA;&lt;p&gt;In the end, I give up. It makes no sense to waste hours making an &lt;em&gt;intelligent agent&lt;/em&gt; do a job that would only take a few minutes by hand.&lt;/p&gt;&#xA;&lt;h4 id=&#34;example-2&#34;&gt;Example #2&lt;/h4&gt;&#xA;&lt;p&gt;I must admit I&amp;rsquo;m disappointed, but perhaps I expected too much from my intelligent agent.&lt;/p&gt;&#xA;&lt;p&gt;So, I try to make things easier by choosing a graph from an &lt;a href=&#34;https://pubs.aip.org/aip/apl/article-abstract/86/4/042501/117168/Overdamped-Nb-Al-AlOx-Nb-Josephson-junctions&#34;&gt;old article of mine&lt;/a&gt; that shouldn&amp;rsquo;t be too challenging for Gemini 3 Flash.&lt;/p&gt;&#xA;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/042501_fig2.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;Instead, it gets stuck and makes incomprehensible mistakes. With a little help, it manages to pull out the data shown in this graph, which still has several missing points (highlighted by the colored boxes) or, even worse, points added quite imaginatively (such as the one indicated by the arrow).&lt;/p&gt;&#xA;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/recreate_figure_2.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;Another disappointment. Why use an &lt;em&gt;intelligent agent&lt;/em&gt; if it&amp;rsquo;s faster to acquire the points by hand?&lt;/p&gt;&#xA;&lt;h4 id=&#34;example-3&#34;&gt;Example #3&lt;/h4&gt;&#xA;&lt;p&gt;But I don&amp;rsquo;t give up easily and try once more to get it to do something good. 
I use a figure from &lt;a href=&#34;https://pubs.aip.org/aip/apl/article-abstract/91/25/252505/334515/Nb-Al-AlOx-Nb-overdamped-Josephson-junctions-above&#34;&gt;another article&lt;/a&gt;, which is quite simple from a graphical point of view. However, it would be useful to be able to acquire it automatically, as the amount of data it contains makes it difficult to digitize accurately by hand.&lt;/p&gt;&#xA;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/252505_fig1.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;Burned by the previous experiences, I first ask it to describe what it &lt;em&gt;sees&lt;/em&gt; in the figure.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;Now could you describe in simple words what you &#34;see&#34; in figure 1?&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;The response is long, precise, and almost flawless,&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;answer&#34;&gt;&#xA;    &lt;summary class=&#34;answer&#34;&gt;&#xA;        Answer&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;answer&#34;&gt;&#xA;        &#xA;    &#xA;    &lt;p&gt;Looking at figure 252505_1_f1.jpeg, I can see:&lt;/p&gt;&#xA;    &lt;p&gt;&lt;strong&gt;Two side-by-side graphs&lt;/strong&gt; showing current-voltage (I-V) characteristics:&lt;/p&gt;&#xA;    &lt;p&gt;&lt;strong&gt;(a) Left graph (T = 2.0 K)&lt;/strong&gt;:&lt;/p&gt;&#xA;    &lt;ul&gt;&#xA;    &lt;li&gt;Shows a &lt;strong&gt;hysteretic&lt;/strong&gt; behavior - the curve forms a loop&lt;/li&gt;&#xA;    &lt;li&gt;The current jumps abruptly at certain voltage points&lt;/li&gt;&#xA;    &lt;li&gt;There&amp;rsquo;s a clear gap between the upward and downward voltage sweeps&lt;/li&gt;&#xA;    &lt;li&gt;The vertical jumps occur around ±1 mV&lt;/li&gt;&#xA;    
&lt;/ul&gt;&#xA;    &lt;p&gt;&lt;strong&gt;(b) Right graph (T = 5.8 K)&lt;/strong&gt;:&lt;/p&gt;&#xA;    &lt;p&gt;Shows &lt;strong&gt;non-hysteretic&lt;/strong&gt; behavior - smooth, continuous curve&lt;/p&gt;&#xA;    &lt;ul&gt;&#xA;    &lt;li&gt;No loop or gap between voltage sweeps&lt;/li&gt;&#xA;    &lt;li&gt;The transition is gradual rather than abrupt&lt;/li&gt;&#xA;    &lt;li&gt;The curve is more rounded at the critical points&lt;/li&gt;&#xA;    &lt;/ul&gt;&#xA;    &lt;p&gt;&lt;strong&gt;Key observation&lt;/strong&gt;: This demonstrates how the same Josephson junction behaves differently at different temperatures. At the lower temperature (2.0 K), it shows hysteresis (memory effect), while at the higher temperature (5.8 K), the hysteresis disappears and the junction becomes &amp;ldquo;overdamped.&amp;rdquo;&lt;/p&gt;&#xA;    &lt;p&gt;Both graphs show current ranging from about -15 to +15 mA on the y-axis, with voltage on the x-axis measured in millivolts (mV).&lt;/p&gt;&#xA;    &#xA;&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;but, in reality, there is an error at the end of the response that would render any attempt to digitize the data in the two graphs impossible.&lt;/p&gt;&#xA;&lt;p&gt;When I point it out, Gemini 3 Flash corrects itself immediately and seems almost sorry. But, as we all know, LLMs are programmed to always please the user.&lt;/p&gt;&#xA;&lt;p&gt;I move forward and try to help it out, explaining exactly what it should do.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;No problem. Now could you extract the values of voltage and current from the two images? 
Please produce a table for each image, and export the table as a csv file and a markdown file, with suffix `_table`.&lt;br&gt;  &#xA;Since one of the I-V curves is hysteretic, I suggest this strategy for extracting the data: start from V=0, increase V up to the maximum value following one of the paths, decrease V to 0 following the other path, do the same for the negative values of V.&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;But despite the suggestion, there&amp;rsquo;s no way to get it to do a decent job. The best I get is this, which at first glance looks like a great result.&lt;/p&gt;&#xA;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/252505_1_f1_lowres_v22.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;It&amp;rsquo;s a pity, though, that there&amp;rsquo;s a serious and incomprehensible error, which I&amp;rsquo;ll leave for you to discover.&lt;/p&gt;&#xA;&lt;p&gt;In any case, the previous image is truly the best of the best of the best, and it was obtained after hours of failed attempts (like &lt;a href=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/252505_1_f1_recreated.png&#34;&gt;this&lt;/a&gt; initial attempt, or &lt;a href=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/252505_1_f1_lowres_v13.png&#34;&gt;this&lt;/a&gt;, or &lt;a href=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/252505_1_f1_lowres_final.png&#34;&gt;this&lt;/a&gt; one) and of hints that went from vague to increasingly precise, much like with an unprepared student from whom you try to get a few words out. 
Not to mention responses that always tried to adapt to the current suggestion without ever, or almost ever, making use of what had been discussed previously.&lt;/p&gt;&#xA;&lt;p&gt;All these fruitless attempts were made even more frustrating by the fact that, several years ago, I had solved the same problem accurately and reproducibly with barely two hundred lines of Python code (&lt;a href=&#34;https://melabit.com/en/2026/01/21/an-unexpected-antigravity/#fn:3&#34;&gt;I remembered wrong&lt;/a&gt;, it wasn&amp;rsquo;t C but Python).&lt;/p&gt;&#xA;&lt;p&gt;And when I finish explaining exactly how it should be done, even suggesting that the key is to convert the original &lt;code&gt;JPEG&lt;/code&gt; file into a &lt;code&gt;PBM&lt;/code&gt; format (which is basically a grid of 0s or 1s depending on whether the pixel is white or black),&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;    &#xA;    &lt;p&gt;My suggestion is to change focus completely. 
If I were you I would:&lt;/p&gt;&#xA;    &lt;ol&gt;&#xA;    &lt;li&gt;&#xA;    &lt;p&gt;take note of the extreme values of voltage and current on the X and Y axes;&lt;/p&gt;&#xA;    &lt;/li&gt;&#xA;    &lt;li&gt;&#xA;    &lt;p&gt;get rid of the extraneous information outside the plot area and of the text (if any) inside it;&lt;/p&gt;&#xA;    &lt;/li&gt;&#xA;    &lt;li&gt;&#xA;    &lt;p&gt;convert the image to the PBM (or PGM) format, that gives you a clean grid of black and white (or gray and white) pixels;&lt;/p&gt;&#xA;    &lt;/li&gt;&#xA;    &lt;li&gt;&#xA;    &lt;p&gt;remove the extraneous grid dots using a 3x3 or 5x5 filter (or anything like that);&lt;/p&gt;&#xA;    &lt;/li&gt;&#xA;    &lt;li&gt;&#xA;    &lt;p&gt;calculate the X and Y position of each black pixel, converting the [values] to Voltage and Current (it is just a proportion).&lt;/p&gt;&#xA;    &lt;/li&gt;&#xA;    &lt;/ol&gt;&#xA;    &lt;p&gt;Clearly you can change the order of the steps or do something different if you believe your approach is better than mine.&lt;/p&gt;&#xA;    &#xA;&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;there&amp;rsquo;s still no way to get it to do something useful.&lt;/p&gt;&#xA;&lt;p&gt;What&amp;rsquo;s even more frustrating is that Gemini didn&amp;rsquo;t hesitate to try to cheat! When it found the original file of measured values that I was using to compare its results with the &lt;em&gt;real&lt;/em&gt; data, it tried to use that file to generate the data I was asking for. 
Clearly, the results were perfect this time, but did I need an artificial &lt;em&gt;intelligence&lt;/em&gt; to achieve that?&lt;/p&gt;&#xA;&lt;h4 id=&#34;further-examples&#34;&gt;Further examples&lt;/h4&gt;&#xA;&lt;p&gt;During the days I spent testing what I&amp;rsquo;ve described in this post, I also ran many other tests, such as asking it to write some simple LaTeX code or convert the tables contained in a few PDF files into &lt;code&gt;CSV&lt;/code&gt; or &lt;code&gt;Markdown&lt;/code&gt;.&lt;/p&gt;&#xA;&lt;p&gt;It handled LaTeX well, but often struggled with tables, even when the layout was simple. There&amp;rsquo;s no need to go into detail here, as there aren&amp;rsquo;t many differences compared to previous examples.&lt;/p&gt;&#xA;&lt;h4 id=&#34;conclusions&#34;&gt;Conclusions&lt;/h4&gt;&#xA;&lt;p&gt;After the excellent results &lt;a href=&#34;https://melabit.com/en/2026/01/21/an-unexpected-antigravity/#task-1-extracting-data-from-images&#34;&gt;described in the previous article&lt;/a&gt;, I was confident I could use the intelligent agents integrated into Antigravity to easily extract data buried in many scientific articles from a few years ago, which would have allowed me to perform new comparative analyses quickly and efficiently.&lt;/p&gt;&#xA;&lt;p&gt;Unfortunately, all the more in-depth tests described in this post turned out to be unsuccessful.&lt;/p&gt;&#xA;&lt;p&gt;The model used here, almost always Gemini 3 Flash (though I also briefly tried Gemini 3 Pro and Claude Sonnet 4.5),&lt;sup id=&#34;fnref:1&#34;&gt;&lt;a href=&#34;#fn:1&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;1&lt;/a&gt;&lt;/sup&gt; showed &lt;strong&gt;inconsistent and unrepeatable behavior&lt;/strong&gt;. 
For example, it would use different methods to solve the same problem without it ever being clear whether the choice of an algorithm or library had a specific technical motivation or was just a matter of statistical probability in the generation of responses.&lt;/p&gt;&#xA;&lt;p&gt;Another serious issue was the &lt;strong&gt;lack of memory&lt;/strong&gt;, much like &lt;a href=&#34;https://youtu.be/3JNLwlcPBPI?si=AuWIQu80vuFst1KP&amp;amp;t=62&#34;&gt;Dory&lt;/a&gt;, the funny fish from the cartoons. Despite successfully solving the problem of extracting data from a histogram just a few days earlier, Gemini was unable to apply the same methodology to a similar case. On other occasions, once a problem was partially solved, Gemini would start working on it again from scratch instead of refining and improving the previous answer.&lt;/p&gt;&#xA;&lt;p&gt;But the problem that perhaps bothered me the most was the &lt;strong&gt;inability to generalize&lt;/strong&gt; the work already done. If you ask it to extract data from an image, Gemini creates a program suited only for &lt;em&gt;that&lt;/em&gt; specific image, without ever trying to develop a general method applicable to other images of the same type. Similarly, if you ask it to generate a graph from a data file, Gemini creates a script suited only for &lt;em&gt;that&lt;/em&gt; file, which is nearly impossible to adapt to other similar cases.&lt;/p&gt;&#xA;&lt;p&gt;When humans tackle a problem, they tend to become fond of the solution and try to reuse and extend it to similar problems. An LLM doesn&amp;rsquo;t care; it reinvents the wheel at every new iteration. It seems like the triumph of throwaway culture: a waste of resources and intelligence (however artificial) taken to the extreme.&lt;/p&gt;&#xA;&lt;p&gt;I&amp;rsquo;m just a guy playing around with these tools. 
However, very serious scientific studies &amp;ndash; which, for instance, tried to use LLMs to rediscover &lt;a href=&#34;https://arxiv.org/abs/2507.06952v4&#34;&gt;Newton&amp;rsquo;s law of gravitation&lt;/a&gt; starting from the orbits of planets around the Sun, or to predict the &lt;a href=&#34;https://pubs.rsc.org/en/content/articlelanding/2025/dd/d5dd00176e&#34;&gt;behavior of a molecule&lt;/a&gt; based on different representations of its structure &amp;ndash; have reached the exact same conclusions.&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;&lt;em&gt;Our analysis reveals that these models instead behave as if&#xA;they develop task-specific heuristics that fail to generalize.&lt;/em&gt;&lt;sup id=&#34;fnref:2&#34;&gt;&lt;a href=&#34;#fn:2&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;2&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&lt;/blockquote&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;&lt;em&gt;Simply increasing model size does not improve performance or generalization ability.&lt;/em&gt;&lt;sup id=&#34;fnref:3&#34;&gt;&lt;a href=&#34;#fn:3&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;3&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&lt;/blockquote&gt;&#xA;&lt;p&gt;Lastly, even if  at first glance it seems otherwise, we must never forget that LLMs &lt;strong&gt;don&amp;rsquo;t actually understand&lt;/strong&gt; what they are being asked; they are just very good at finding a needle in a very, very, very large haystack. An LLM can explain in great detail what a &amp;ldquo;Josephson junction I-V characteristic&amp;rdquo; is because it has &lt;em&gt;read&lt;/em&gt; many books and articles on the subject. 
But when it comes to connecting the ideal concept of &amp;ldquo;I-V characteristic&amp;rdquo; with its experimental representation in a figure, it fails miserably, proving it is just a well-trained parrot.&lt;/p&gt;&#xA;&lt;p&gt;Some claim that the latest generation of LLMs can write an &lt;a href=&#34;https://arstechnica.com/ai/2026/02/sixteen-claude-ai-agents-working-together-created-a-new-c-compiler/&#34;&gt;entire C compiler&lt;/a&gt; on their own, or produce tens of thousands of lines of &lt;a href=&#34;https://shumer.dev/something-big-is-happening&#34;&gt;perfectly functioning code&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;Personally, I would be satisfied if these models could produce a data file or a table in a reproducible manner, without running into inexplicable errors or inconsistent results. They should also avoid awkward situations like the ones I experienced, where one day everything goes smoothly and the next day nothing works.&lt;/p&gt;&#xA;&lt;div class=&#34;footnotes&#34; role=&#34;doc-endnotes&#34;&gt;&#xA;&lt;hr&gt;&#xA;&lt;ol&gt;&#xA;&lt;li id=&#34;fn:1&#34;&gt;&#xA;&lt;p&gt;I also used the paid version of Gemini, kindly provided by my university, without noticing any differences compared to the free version. Those who claim that paid models do things the free ones can only dream of should provide solid evidence before being believed.&amp;#160;&lt;a href=&#34;#fnref:1&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:2&#34;&gt;&#xA;&lt;p&gt;K. Vafa &lt;em&gt;et al.&lt;/em&gt;, &lt;a href=&#34;https://arxiv.org/abs/2507.06952v4&#34;&gt;What Has a Foundation Model Found? 
Using Inductive Bias to Probe for World Models&lt;/a&gt;, arXiv 2507.06952v4, 2025.&amp;#160;&lt;a href=&#34;#fnref:2&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:3&#34;&gt;&#xA;&lt;p&gt;B. Yan &lt;em&gt;et al.&lt;/em&gt;, &lt;a href=&#34;https://pubs.rsc.org/en/content/articlelanding/2025/dd/d5dd00176e&#34;&gt;Inconsistency of LLMs in molecular representations&lt;/a&gt;, Digital Discovery vol. 4, 2025.&amp;#160;&lt;a href=&#34;#fnref:3&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;/div&gt;&#xA;</description>
    </item>
    <item>
      <title>An unexpected Antigravity</title>
      <link>https://static.233.196.69.159.clients.your-server.de/en/2026/01/21/an-unexpected-antigravity/</link>
      <pubDate>Wed, 21 Jan 2026 18:00:00 +0000</pubDate>
      <guid>https://static.233.196.69.159.clients.your-server.de/en/2026/01/21/an-unexpected-antigravity/</guid>
      <description>&lt;p&gt;I confess, when I &lt;a href=&#34;https://melabit.com/en/2025/12/15/photocopied/&#34;&gt;started using Antigravity&lt;/a&gt; I had many doubts, because the new &lt;em&gt;revolutionary&lt;/em&gt; editor produced by Google seemed to me like just another clone of Microsoft&amp;rsquo;s &lt;a href=&#34;https://code.visualstudio.com/&#34;&gt;VS Code&lt;/a&gt;.&lt;sup id=&#34;fnref:1&#34;&gt;&lt;a href=&#34;#fn:1&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;1&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&#xA;&lt;p&gt;But as soon as I started using the &lt;em&gt;agentic&lt;/em&gt; features of Google Antigravity, I had to change my mind, because there is truly something good there.&lt;/p&gt;&#xA;&lt;p&gt;Of course, there are also many limitations, but these are shared by all other LLMs (&lt;em&gt;Large Language Models&lt;/em&gt;), which work well when guided step-by-step through the task at hand, but also lack imagination and struggle with new and slightly complicated problems.&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-01-21-sorpresa-antigravity/Gemini_Generated_Image_nu8qhynu8qhynu8q.jpg&#34; alt=&#34;&#34;&gt;&#xA;&amp;ndash; &lt;em&gt;Image generated by &lt;a href=&#34;https://gemini.google.com&#34;&gt;Google Gemini&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;h4 id=&#34;editor-and-ide&#34;&gt;Editor and IDE&lt;/h4&gt;&#xA;&lt;p&gt;Google Antigravity is an &lt;a href=&#34;https://en.wikipedia.org/wiki/Integrated_development_environment&#34;&gt;IDE (Integrated Development Environment)&lt;/a&gt;, that is, an editor with additional features that help write and debug code.&lt;/p&gt;&#xA;&lt;p&gt;Being a derivative of VS Code, it inherits all its positive and negative aspects, first and foremost the fact that it is based on &lt;a href=&#34;https://www.electronjs.org/&#34;&gt;Electron&lt;/a&gt;, a multi-platform framework that is very convenient for developers but also very demanding in 
terms of resources and disk space.&lt;/p&gt;&#xA;&lt;p&gt;In fact, as I already noted in my &lt;a href=&#34;https://melabit.com/en/2025/12/15/photocopied/&#34;&gt;last post&lt;/a&gt;, once installed, Google Antigravity occupies a whopping 700 MB on my Mac (696.6 MB to be precise), a hair more than VS Code itself (683.9 MB) and slightly less than its &lt;em&gt;father&lt;/em&gt; Windsurf (718.7 MB). For comparison, my two favourite editors, TextMate and BBEdit, occupy 40 MB and 65 MB respectively. &lt;sup id=&#34;fnref:2&#34;&gt;&lt;a href=&#34;#fn:2&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;2&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&#xA;&lt;p&gt;There is little to say about the code editing and development support features. They are identical to those of VS Code and can be greatly extended through the 8,500+ extensions available on the &lt;a href=&#34;https://open-vsx.org/&#34;&gt;Open VSX Registry&lt;/a&gt;, a database of extensions specifically designed for VS Code-derived editors.&lt;/p&gt;&#xA;&lt;p&gt;In theory, one could use &lt;a href=&#34;https://marketplace.visualstudio.com/&#34;&gt;VS Code extensions&lt;/a&gt;, but Microsoft recently &lt;a href=&#34;https://freedium-mirror.cfd/https://medium.com/@S3CloudHub/microsoft-quietly-blocked-cursor-from-using-its-vscode-extension-heres-the-line-of-code-that-40d65f14ef0f&#34;&gt;blocked&lt;/a&gt; the use of the VS Code marketplace for all other derivative editors (including Cursor, Windsurf, or Antigravity). So much for &lt;a href=&#34;https://code.visualstudio.com/&#34;&gt;open-source&lt;/a&gt; principles&amp;hellip;&lt;/p&gt;&#xA;&lt;h4 id=&#34;agentic-functions&#34;&gt;&lt;em&gt;Agentic&lt;/em&gt; functions&lt;/h4&gt;&#xA;&lt;p&gt;So, what exactly is &lt;a href=&#34;https://www.ibm.com/think/topics/agentic-ai&#34;&gt;Agentic Artificial Intelligence&lt;/a&gt;? 
It is an AI system that can act autonomously and achieve complex goals with minimal human supervision, without needing detailed, well-defined instructions.&lt;/p&gt;&#xA;&lt;p&gt;It&amp;rsquo;s a bit like the dream of amateur cooks: having a robot that can look at what&amp;rsquo;s in the fridge and cupboard and prepare a complete lunch all by itself, after we vaguely explain what we want.&lt;/p&gt;&#xA;&lt;p&gt;But it is also the dream of the amateur &lt;em&gt;vibe coders&lt;/em&gt;, those who would like to be able to ask the AI to program an application &amp;ldquo;like this and that,&amp;rdquo; go out for a pizza, and find everything ready upon return, including ports to the major platforms and an e-commerce site to sell it.&lt;/p&gt;&#xA;&lt;h4 id=&#34;putting-antigravity-to-the-test&#34;&gt;Putting Antigravity to the test&lt;/h4&gt;&#xA;&lt;p&gt;So far, all these experiments haven&amp;rsquo;t exactly &lt;a href=&#34;https://freedium-mirror.cfd/https://ninza7.medium.com/anthropic-gave-an-ai-a-job-the-results-were-hilarious-and-terrifying-e85a204b06f4&#34;&gt;looked great&lt;/a&gt; but, as &lt;a href=&#34;https://it.wikipedia.org/wiki/Mai_dire_mai_%28film%29&#34;&gt;Sean Connery teaches us&lt;/a&gt;, never say never. So let&amp;rsquo;s see how the agentic functions integrated into Google Antigravity work in practice.&lt;/p&gt;&#xA;&lt;p&gt;To test them, I asked the program to perform a few specific tasks, ranging from simple things to a particularly complex one. I tried to leave it free to decide what to do and to intervene only when it was truly necessary. 
Although I chose tasks that were relevant to my own research, I believe the results are significant nonetheless.&lt;/p&gt;&#xA;&lt;p&gt;Here is how it went.&lt;/p&gt;&#xA;&lt;h5 id=&#34;before-starting&#34;&gt;Before starting&lt;/h5&gt;&#xA;&lt;p&gt;Before starting, I pretended that I was using Antigravity for the first time, and deleted all the files generated by the application with &lt;a href=&#34;https://freemacsoft.net/appcleaner/&#34;&gt;AppCleaner&lt;/a&gt;. I ran the &lt;em&gt;onboarding&lt;/em&gt; process again, choosing to &lt;code&gt;Start fresh&lt;/code&gt; without importing configurations from VS Code or Windsurf, and to use the &lt;code&gt;Review driven development&lt;/code&gt; mode without modifying the default options.&lt;/p&gt;&#xA;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-01-21-sorpresa-antigravity/agent-mode.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;I didn&amp;rsquo;t install any extensions, because I prefer to decide for myself what I need.&lt;/p&gt;&#xA;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-01-21-sorpresa-antigravity/extensions.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;One more thing: the version of Antigravity I used to write this post is 1.11.17. However, given the speed of development, it&amp;rsquo;s possible that some aspects of the interface and how the program operates may have changed by the time this post is published.&lt;/p&gt;&#xA;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-01-21-sorpresa-antigravity/version.png&#34; alt=&#34;&#34;&gt;&lt;h5 id=&#34;task-1-extracting-data-from-images&#34;&gt;Task #1: Extracting data from images&lt;/h5&gt;&#xA;&lt;p&gt;Let&amp;rsquo;s start with something simple, yet very important when dealing with old scientific papers: extracting data from images. 
My goal was to verify whether it is possible to instruct the Antigravity AI to extract data on its own from the figures in &lt;a href=&#34;https://link.springer.com/article/10.1007/s13762-025-06837-z&#34;&gt;this paper&lt;/a&gt;, which has recently been published. I am not at all proud of it (to put it mildly), but the referees surprisingly accepted it almost without a word.&lt;/p&gt;&#xA;&lt;div style = &#34;border-style: solid; border-width: 0px 0px 0px 12px; border-color: #03a87c; background-color: #defef6; padding: 1em; color:black; font-size:100%;&#34;&gt;&#xA;&lt;p&gt;Today, the &lt;a href=&#34;https://www.nature.com/collections/prbfkwmwvz&#34;&gt;reproducibility of results&lt;/a&gt; has become a key principle and it is normal to share all the data from a scientific paper on public repositories like &lt;a href=&#34;https://zenodo.org&#34;&gt;Zenodo&lt;/a&gt;, &lt;a href=&#34;https://osf.io&#34;&gt;OSF&lt;/a&gt;, &lt;a href=&#34;https://dataverse.org/&#34;&gt;Dataverse&lt;/a&gt; or maybe &lt;a href=&#34;https://github.com/&#34;&gt;GitHub&lt;/a&gt;. Until a few years ago, however, the original data were essentially only accessible through the figures in papers. This is precisely why it is important to be able to extract them from images.&lt;/p&gt;&#xA;&lt;p&gt;Although there are many programs dedicated to extracting data from figures (like &lt;a href=&#34;https://akhuettel.github.io/engauge-digitizer/&#34;&gt;this&lt;/a&gt;, &lt;a href=&#34;https://www.digitizeit.xyz/&#34;&gt;this&lt;/a&gt; or &lt;a href=&#34;https://cran.r-project.org/web/packages/metaDigitise/vignettes/metaDigitise.html&#34;&gt;this&lt;/a&gt;), they require user guidance &amp;ndash; at the very least to define the position of the X and Y axes and the coordinate limits. Some programs can work automatically, but only if they deal with well-defined types of images. 
&lt;sup id=&#34;fnref:3&#34;&gt;&lt;a href=&#34;#fn:3&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;3&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;/br&gt;&#xA;&lt;p&gt;Before starting, I created a new directory in which to save the graphs that I had downloaded from the paper&amp;rsquo;s &lt;a href=&#34;https://link.springer.com/article/10.1007/s13762-025-06837-z&#34;&gt;web page&lt;/a&gt;, just as a colleague who wanted to repeat my experiment would do. While I was at it, I also included the PDF file of the paper, which I will need later.&lt;/p&gt;&#xA;&lt;div style = &#34;border-style: solid; border-width: 0px 0px 0px 12px; border-color: #03a87c; background-color: #defef6; padding: 1em; color:black; font-size:100%;&#34;&gt;&#xA;&lt;p&gt;&lt;strong&gt;NOTE.&lt;/strong&gt; All of the questions listed below have been copied without correcting typos and clumsy syntax, due (in part) to the excitement of seeing the results quickly. However, these errors are useful because they demonstrate that these systems can understand our requests even when the form is far from perfect.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;/br&gt;&#xA;&lt;p&gt;I chose to use &lt;code&gt;Gemini 3 Pro (High)&lt;/code&gt;, which is currently the most powerful model in Antigravity. First, I asked it to orient itself within the directory content.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;In the current directory you will find a pdf file, a couple of folders and some inages in png or webp format. Is all this right?&#xA;&lt;div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;Gemini thinks for a couple of seconds and then confirms that everything is correct. 
Very well, then&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;Now open the file containing &#34;Fig1&#34; in the filename. This is an histogram showing the concentration of several chemical species in two kinds of Shungite, type Z (darker bars) and type B (lighter bars). Could you extract the values of concentration from the images? Please produce a table with the name of each chemical species and the associated concentrations for the Type Z and Type B material. Export the table as a csv file named &#34;gemini-fig1.csv&#34; and as a markdown file &#34;gemini-fig1.md&#34;&#xA;&lt;div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;This is the image that Gemini had to deal with&lt;/p&gt;&#xA;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-01-21-sorpresa-antigravity/13762_2025_6837_Fig1_HTML.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;Despite me being deliberately vague about the file name, Gemini can find it without problems and, after about twenty seconds, produces the two requested files. 
The table generated by Gemini is&lt;/p&gt;&#xA;&lt;table&gt;&#xA;  &lt;thead&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;th&gt;Chemical Species&lt;/th&gt;&#xA;          &lt;th&gt;Type Z (Concentration %)&lt;/th&gt;&#xA;          &lt;th&gt;Type B (Concentration %)&lt;/th&gt;&#xA;      &lt;/tr&gt;&#xA;  &lt;/thead&gt;&#xA;  &lt;tbody&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td&gt;SiO2&lt;/td&gt;&#xA;          &lt;td&gt;41&lt;/td&gt;&#xA;          &lt;td&gt;52&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td&gt;C&lt;/td&gt;&#xA;          &lt;td&gt;33&lt;/td&gt;&#xA;          &lt;td&gt;16&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td&gt;Al2O3&lt;/td&gt;&#xA;          &lt;td&gt;11&lt;/td&gt;&#xA;          &lt;td&gt;16&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td&gt;Fe2O3&lt;/td&gt;&#xA;          &lt;td&gt;5&lt;/td&gt;&#xA;          &lt;td&gt;3&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td&gt;FeO&lt;/td&gt;&#xA;          &lt;td&gt;3&lt;/td&gt;&#xA;          &lt;td&gt;2&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td&gt;TiO2&lt;/td&gt;&#xA;          &lt;td&gt;0.2&lt;/td&gt;&#xA;          &lt;td&gt;1&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td&gt;MnO&lt;/td&gt;&#xA;          &lt;td&gt;0.1&lt;/td&gt;&#xA;          &lt;td&gt;0.2&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td&gt;S&lt;/td&gt;&#xA;          &lt;td&gt;1.2&lt;/td&gt;&#xA;          &lt;td&gt;0.5&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td&gt;MgO&lt;/td&gt;&#xA;          &lt;td&gt;0.5&lt;/td&gt;&#xA;          &lt;td&gt;1.5&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td&gt;Na2O&lt;/td&gt;&#xA;          &lt;td&gt;1.5&lt;/td&gt;&#xA;          &lt;td&gt;1.2&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td&gt;CaO&lt;/td&gt;&#xA;          
&lt;td&gt;0.5&lt;/td&gt;&#xA;          &lt;td&gt;2&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td&gt;K2O&lt;/td&gt;&#xA;          &lt;td&gt;1&lt;/td&gt;&#xA;          &lt;td&gt;2&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td&gt;H2O&lt;/td&gt;&#xA;          &lt;td&gt;2&lt;/td&gt;&#xA;          &lt;td&gt;1.5&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td&gt;minor elements&lt;/td&gt;&#xA;          &lt;td&gt;0.5&lt;/td&gt;&#xA;          &lt;td&gt;0.5&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;  &lt;/tbody&gt;&#xA;&lt;/table&gt;&#xA;&lt;p&gt;Comparing it with the &lt;em&gt;real&lt;/em&gt; table with the measured values of concentration&lt;/p&gt;&#xA;&lt;table&gt;&#xA;  &lt;thead&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;th style=&#34;text-align: left&#34;&gt;Species&lt;/th&gt;&#xA;          &lt;th style=&#34;text-align: left&#34;&gt;Type_Z&lt;/th&gt;&#xA;          &lt;th style=&#34;text-align: left&#34;&gt;Type_B&lt;/th&gt;&#xA;      &lt;/tr&gt;&#xA;  &lt;/thead&gt;&#xA;  &lt;tbody&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;SiO2&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;40.8&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;52&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;C&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;33.4&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;16.3&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;Al2O3&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;10.6&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;15.8&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;Fe2O3&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: 
left&#34;&gt;4.7&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;3.2&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;FeO&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;3.5&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;1.9&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;TiO2&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;0.4&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;0.9&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;MnO&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;0.14&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;0.12&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;S&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;1.1&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;0.6&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;MgO&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;0.3&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;1.6&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;Na2O&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;1.6&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;1.2&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;CaO&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;0.4&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;2.3&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td 
style=&#34;text-align: left&#34;&gt;K2O&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;1&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;2.2&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;H2O&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;1.6&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;1.4&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;      &lt;tr&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;Other&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;0.46&lt;/td&gt;&#xA;          &lt;td style=&#34;text-align: left&#34;&gt;0.48&lt;/td&gt;&#xA;      &lt;/tr&gt;&#xA;  &lt;/tbody&gt;&#xA;&lt;/table&gt;&#xA;&lt;p&gt;it is immediately apparent that all the values extracted from the figure are very close to the real values, with minimal approximations of less than 0.5%.&lt;/p&gt;&#xA;&lt;p&gt;I am truly impressed by the result: Antigravity did everything in a few seconds without any intervention from me other than preparing the files and writing the prompt. While normal chatbots like ChatGPT or Claude can do similar things, Antigravity allows us to work directly on our computer with our files, without having to transfer anything back and forth via a web interface.&lt;/p&gt;&#xA;&lt;p&gt;Task completed with full marks.&lt;/p&gt;&#xA;&lt;h5 id=&#34;task-2-replication-of-graphs&#34;&gt;Task #2: Replication of graphs&lt;/h5&gt;&#xA;&lt;p&gt;For reasons that are not relevant here, I contributed very little &amp;ndash; and mostly critically &amp;ndash; to the writing of the &lt;a href=&#34;https://link.springer.com/article/10.1007/s13762-025-06837-z&#34;&gt;paper&lt;/a&gt; mentioned in Task #1. Specifically, I didn&amp;rsquo;t prepare the graphs as usual. 
Instead, they were created by a colleague using &lt;a href=&#34;https://www.mathworks.com/products/matlab.html&#34;&gt;Matlab&lt;/a&gt;, a scientific computing environment much loved by engineers. They are right to love it, because with Matlab you can perform very complicated calculations with little effort. However, producing high-quality graphs is certainly not one of its strengths.&lt;/p&gt;&#xA;&lt;p&gt;Let&amp;rsquo;s see if I can get Gemini 3 Pro to recreate the histogram from Task #1 using the &lt;a href=&#34;https://ggplot2.tidyverse.org/&#34;&gt;ggplot2&lt;/a&gt; library in &lt;code&gt;R&lt;/code&gt;.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;Now generate the R code to recreate the same graph, using ggplot2 and whenever possible the functions available in the tidyverse.&#xA;&lt;div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;After a while, the script is ready and saved automatically in the project directory. I give it a quick look and it seems fine. So I open the integrated terminal in Antigravity and run the script as suggested.&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;Rscript recreate_fig1.R&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Apart from the inverted colors, the figure is identical to the original.&lt;/p&gt;&#xA;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-01-21-sorpresa-antigravity/recreated_fig1.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;I want to be clear on this point: I didn&amp;rsquo;t touch a single line of the code generated in Antigravity by Gemini, and I won&amp;rsquo;t do so later, either. 
Yet Gemini manages to recreate an almost perfect copy of the original in &lt;code&gt;R&lt;/code&gt;, &lt;em&gt;reading&lt;/em&gt; the data in the figure and recognizing all the surrounding elements of the graph, such as the X and Y axis labels, the numerical scales, or the legend.&lt;/p&gt;&#xA;&lt;p&gt;I know R well enough and could have achieved the same result in a short time, but there is no doubt that, even in cases like this, Antigravity can be a valuable help for those who don&amp;rsquo;t know programming well.&lt;/p&gt;&#xA;&lt;p&gt;However, I am a hard guy to please and I ask Gemini to invert the colors&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;Sorry, but the colors are inverted. Type Z is light grey and Type B is dark grey. Please switch the colors so that they are identical to the original. Save the new R code as `Rscript recreate_fig1-1.R`&#xA;&lt;div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;I run the new script and now the colors are correct. I am pleased to note that the name of the generated &lt;code&gt;png&lt;/code&gt; file has also been updated, making it easier to compare the different files produced by Gemini.&lt;/p&gt;&#xA;&lt;p&gt;Once again, I am very impressed. Gemini did everything perfectly and I didn&amp;rsquo;t have to correct a single line of code.&lt;/p&gt;&#xA;&lt;p&gt;It was undoubtedly an easy job, but the result is still very interesting. 
This is especially true given that we are talking about a specialised language like R, which doesn&amp;rsquo;t have the same volume of training examples as more popular languages like Python or JavaScript.&lt;/p&gt;&#xA;&lt;p&gt;It seems that this task, too, has been completed with full marks.&lt;/p&gt;&#xA;&lt;h5 id=&#34;task-3-generation-of-new-graphs&#34;&gt;Task #3: Generation of new graphs&lt;/h5&gt;&#xA;&lt;p&gt;Gemini was able to successfully recreate the histogram from the &lt;a href=&#34;https://link.springer.com/article/10.1007/s13762-025-06837-z&#34;&gt;paper&lt;/a&gt;, but I would like to check whether there are any more effective alternative representations.&lt;/p&gt;&#xA;&lt;p&gt;I pass the ball back to Gemini 3 Pro, which comes up with two interesting ideas: a &lt;em&gt;Lollipop&lt;/em&gt; chart and a &lt;em&gt;Dumbbell plot&lt;/em&gt;, which however don&amp;rsquo;t fully satisfy me. I try to get it to make some changes; Gemini always performs flawlessly, but the result doesn&amp;rsquo;t improve.&lt;/p&gt;&#xA;&lt;p&gt;In the end, the most balanced graph is the usual histogram, this time with the X and Y axes swapped and very different colors for the two types of material.&lt;/p&gt;&#xA;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-01-21-sorpresa-antigravity/alternative_fig1_horizontal_bar.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;Since I am hard to please, I choose a much more complicated figure from the same paper&lt;/p&gt;&#xA;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-01-21-sorpresa-antigravity/concentrazione_metalli3.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;and I ask Gemini to repeat the process of generating equivalent graphs using &lt;code&gt;R&lt;/code&gt; and &lt;code&gt;ggplot2&lt;/code&gt;. 
Gemini gets straight to work, understanding by itself that the concentration of the elements at day 0 applies to both types of material, a fact that isn&amp;rsquo;t trivial to grasp just by looking at the graph.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;answer&#34; open&gt;&#xA;    &lt;summary class=&#34;answer&#34;&gt;&#xA;        Answer&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;answer&#34;&gt;&#xA;        &#xA;For Day 0 (&#34;untreated sample&#34;), I assigned the same concentration value to both Type Z and Type B in the dataset. This format suggests that both treatments start from the same baseline, which will make plotting lines easier in R.&#xA;&lt;div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;I&amp;rsquo;ll spare you the details. Gemini easily produces several different types of data visualisations from the image above, such as a &lt;a href=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-01-21-sorpresa-antigravity/fig4_heatmap.png&#34;&gt;heatmap&lt;/a&gt; or a &lt;a href=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-01-21-sorpresa-antigravity/fig4_sankey_grid_bold.png&#34;&gt;Sankey plot&lt;/a&gt;. While the graphs weren&amp;rsquo;t 100% satisfactory, they demonstrate that Gemini can perform the tasks required by demanding users such as myself.&lt;/p&gt;&#xA;&lt;p&gt;What truly stuns me is that Gemini understands by itself that, when I ask it to arrange the graphs in two columns, one for each material studied, I actually want what is called a &lt;code&gt;facet&lt;/code&gt; in R jargon. This may seem like a small thing, but it means that these more complex representations become accessible even to those who know what they want but cannot express it precisely.&lt;/p&gt;&#xA;&lt;p&gt;It must be said, however, that as convenient (and even fun) as it is to ask and watch Gemini work, when you get to the finer modifications, you lose much more time writing questions and waiting for answers than working directly on the code. 
This is a detail to keep in mind.&lt;/p&gt;&#xA;&lt;p&gt;Let&amp;rsquo;s say that this task deserves a passing grade.&lt;/p&gt;&#xA;&lt;h5 id=&#34;task-4-interaction-with-code-and-operating-system&#34;&gt;Task #4: Interaction with code and operating system&lt;/h5&gt;&#xA;&lt;p&gt;Now, let&amp;rsquo;s see if the Gemini 3 Pro integrated into Antigravity can perform basic file and folder management operations. Although I ran all the tests on macOS, the examples should also apply to various incarnations of Linux and BSD Unix.&lt;/p&gt;&#xA;&lt;p&gt;First, I ask it to tidy up the files generated so far, moving the R scripts to the &lt;code&gt;code&lt;/code&gt; directory, the data files to &lt;code&gt;data&lt;/code&gt;, and the generated graphs to &lt;code&gt;output&lt;/code&gt;. After thinking about it for a moment, Gemini 3 Pro creates a simple shell script to execute the request. It even makes an error, notices it, and asks for authorization to correct it. 
Very, very good.&lt;/p&gt;&#xA;&lt;p&gt;But what I like even more is that it automatically updates all the previously written code with the new file paths.&lt;/p&gt;&#xA;&lt;p&gt;I then ask it to add my usual preamble to each &lt;code&gt;R&lt;/code&gt; script, which creates a well-defined R environment. I give it an example from one of my old scripts, but otherwise allow it to do whatever it thinks is best. Gemini does everything perfectly, again updating all the file paths created so far. Truly impressive!&lt;/p&gt;&#xA;&lt;p&gt;Gemini also perfectly handles the conversion of the indentation from two spaces to four, as I like it. In fact, it even recognises the need to do this without introducing formatting errors into the code. Better than a &lt;em&gt;linter&lt;/em&gt;.&lt;/p&gt;&#xA;&lt;p&gt;Lastly, I move on to trying &lt;em&gt;code versioning&lt;/em&gt; with &lt;code&gt;git&lt;/code&gt;, which is always useful when a project starts to become slightly complicated.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;Could you now initialize a local `git` repository for this project? Versioning is very useful as soon as a project starts to become complicated.&#xA;&lt;div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;Everything works wonderfully; Gemini creates the requested &lt;em&gt;repository&lt;/em&gt;, makes the initial &lt;em&gt;commit&lt;/em&gt;, and also adds a sensible &lt;code&gt;.gitignore&lt;/code&gt; file. 
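Just for reference, here is roughly what that setup corresponds to when done by hand in the Terminal. This is only a sketch under stated assumptions (git 2.28 or later for the branch flag; the scratch directory, placeholder file, and commit message are illustrative choices of mine, not the commands Gemini actually ran):

```shell
# Hand-made equivalent of the agent's git setup; paths and messages are
# illustrative assumptions, not Gemini's actual output.
set -e
cd "$(mktemp -d)"          # scratch directory, to keep the demo harmless

mkdir -p code data output
echo 'library(ggplot2)' > code/recreate_fig1.R    # stand-in for the generated scripts
printf 'output/\n.DS_Store\n' > .gitignore        # ignore generated graphs and macOS cruft

git init -q -b main                               # create the local repository (git 2.28+)
git add .gitignore code
git -c user.name=demo -c user.email=demo@example.com commit -q -m 'Initial commit'

git status --short         # prints nothing: the working tree is clean
```

On a real project you would of course run the git steps in the project directory itself; the temporary directory here only keeps the demonstration harmless.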
Obviously, I can query the repository &lt;em&gt;status&lt;/em&gt; or perform all the usual &lt;code&gt;git&lt;/code&gt; operations I am used to doing from the Terminal.&lt;/p&gt;&#xA;&lt;p&gt;While I am a bit afraid to allow an &lt;em&gt;agentic&lt;/em&gt; system to perform these operations on my Mac, I must admit that the result is perfect.&lt;/p&gt;&#xA;&lt;p&gt;What I wrote at the end of the previous task still holds true: a user who is familiar with &lt;code&gt;git&lt;/code&gt; commands can work much faster in the Terminal than by asking Gemini. However, for the average researcher (or general user) who is unfamiliar with the Terminal and &lt;em&gt;versioning&lt;/em&gt; systems, using an agentic system like this can be an excellent way to interact with code at a low level without having to study pages and pages of documentation.&lt;/p&gt;&#xA;&lt;p&gt;In any case, another task passed with full marks.&lt;/p&gt;&#xA;&lt;h5 id=&#34;task-5-statistical-data-analysis&#34;&gt;Task #5: Statistical data analysis&lt;/h5&gt;&#xA;&lt;p&gt;I won&amp;rsquo;t make this long, because Antigravity/Gemini 3 Pro is a bit weak on statistical analysis.&lt;/p&gt;&#xA;&lt;p&gt;If I let it do whatever it wants, it comes up with nonsense, such as creating correlation matrices between concentrations of different heavy metals or trying to compare &amp;ldquo;differences of effectiveness&amp;rdquo; between the two materials. 
Admittedly, it&amp;rsquo;s not an easy task, but it shows &amp;ndash; should it still be necessary to demonstrate it &amp;ndash; that these intelligent systems are very useful when given precise instructions, but struggle to work independently when the user doesn&amp;rsquo;t have a clear idea of what they want to achieve.&lt;/p&gt;&#xA;&lt;p&gt;In fact, once I explain properly what kind of data I am dealing with and what I want to achieve:&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;The two types of shungite, type Z and type B, are different materials and they are used independently. In other words, the experiments made using type Z material is completely independent from the experiment with type B, so the two materials cannot be directly compared. This means that the stat_difference_boxplot has little sense, and the same is true for the correlation matrix which photographs a single day (I guess day 0 for only one type of material), while we are interested to time series. Therefore you should try to use some statistical analysis able to deal with time series.&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;Gemini thinks about it for a while, makes some mistakes but manages to correct them on its own, and eventually comes up with a small model showing how type Z material absorbs heavy metals more rapidly than type B material. If there were more data available, it would make sense to reflect a bit on the model, but for now, it&amp;rsquo;s enough.&lt;/p&gt;&#xA;&lt;p&gt;Hard to give a grade here, let&amp;rsquo;s say unclassified.&lt;/p&gt;&#xA;&lt;h5 id=&#34;task-6-extracting-tables-from-pdf-files&#34;&gt;Task #6: Extracting tables from PDF files&lt;/h5&gt;&#xA;&lt;p&gt;Here we enter hostile territory. Automatically extracting the content of tables in PDF files may seem trivial, but it is a subject that has puzzled hundreds of researchers, myself included. The main problem is that the PDF format is designed to manage the visual aspect of the page, but &lt;em&gt;knows&lt;/em&gt; nothing about its content.&lt;/p&gt;&#xA;&lt;p&gt;For my tests, I asked Antigravity to open the PDF file of one of the &lt;a href=&#34;https://protezionecivile.regione.puglia.it/annali-idrologici-parte-i-documenti-dal-1921-al-2021&#34;&gt;Hydrological Yearbooks&lt;/a&gt; of the Apulia Region, in southern Italy, and extract the minimum and maximum temperature data measured at a few meteorological stations in Apulia.&lt;/p&gt;&#xA;&lt;p&gt;While daily data is now &lt;a href=&#34;https://reteidrometeo.protezionecivile.puglia.it/polarisopen/gis/map&#34;&gt;available online&lt;/a&gt;, the PDF Yearbooks still remain a fundamental resource as they contain temperature and precipitation data from approximately 150 meteorological stations since 1921. 
This allows us to &lt;a href=&#34;https://iopscience.iop.org/article/10.1088/1742-6596/1065/19/192005&#34;&gt;analyze&lt;/a&gt; how the &lt;a href=&#34;https://www.sciencedirect.com/science/article/abs/pii/S0048969719344067&#34;&gt;region&amp;rsquo;s climate has changed&lt;/a&gt; over the last century.&lt;/p&gt;&#xA;&lt;p&gt;Years ago, I &lt;a href=&#34;https://presentations.copernicus.org/EGU23/EGU23-15072_presentation.pdf&#34;&gt;solved&lt;/a&gt; the problem of extracting data from these PDF files through a combination of &lt;code&gt;R&lt;/code&gt; and &lt;code&gt;AWK&lt;/code&gt; scripts. However, the procedure is long and complex, as many inconsistencies in the tables must be corrected manually.&lt;sup id=&#34;fnref:4&#34;&gt;&lt;a href=&#34;#fn:4&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;4&lt;/a&gt;&lt;/sup&gt; In the end, extracting data for one year takes a couple of hours of careful work &amp;ndash; not too bad given the complexity of the task &amp;ndash; but it would be very useful to delegate this task to an AI.&lt;/p&gt;&#xA;&lt;p&gt;Therefore, I try to have Gemini 3 Pro do it, writing a long prompt where I explain exactly what to do:&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;    &#xA;    &lt;p&gt;Open &lt;code&gt;annale2013.pdf&lt;/code&gt; and go to page 6, where you will find a pair of data tables with measured precipitations [sic! it should have been &amp;ldquo;temperatures&amp;rdquo;] for two weather stations. The data is arranged in pairs of columns, and each pair corresponds to a month, &lt;code&gt;G&lt;/code&gt; for &lt;code&gt;gennaio&lt;/code&gt; (january), &lt;code&gt;F&lt;/code&gt; for &lt;code&gt;febbraio&lt;/code&gt; (february), etc,. 
until &lt;code&gt;D&lt;/code&gt; or &lt;code&gt;dicembre&lt;/code&gt; (december).&lt;/p&gt;&#xA;    &lt;p&gt;As for the pairs, the first column is the minimum temperature and the second column is the maximum temperature.&lt;/p&gt;&#xA;    &lt;p&gt;Each row represents a different day.&lt;/p&gt;&#xA;    &lt;p&gt;At the end of each table there are some summary data, &lt;code&gt;Medie&lt;/code&gt;, &lt;code&gt;Med.mens.&lt;/code&gt; and &lt;code&gt;Med.norm&lt;/code&gt; that you can skip.&lt;/p&gt;&#xA;    &lt;p&gt;Missing data can be marked by either one or more &lt;code&gt;&amp;gt;&lt;/code&gt; or &lt;code&gt;&amp;lt;&lt;/code&gt; signs, or by &lt;code&gt;=&lt;/code&gt; or be simply blank. In some tables there are extraneuous signs that you can skip outright.&lt;br&gt;&#xA;    Read the table and generate two different csv files, one for the maximum temperature and one for the minimum temperature, following this example&lt;/p&gt;&#xA;    &lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;Date,Altamura,Andretta&#xA;    2007-01-01,13.4, 7.7,&#xA;    2007-01-02,15.7, 9.7,&#xA;    2007-01-03, 7.1, 2.7,&#xA;    2007-01-04, 9.5, 7.0,&#xA;    &lt;/code&gt;&lt;/pre&gt;&lt;p&gt;The header contains the date in ISO format, YYYY-MM-DD, and the name of the weather station which is written in bold at the top of each table, just under the months. All other information in thsi area, eg., &lt;code&gt;(Te)&lt;/code&gt;, &lt;code&gt;Bacino&lt;/code&gt; etc. is not useful and can be skipped.&lt;/p&gt;&#xA;    &#xA;&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;Gemini tries repeatedly to solve the problem using Python, but its scripts produce a lot of errors and skip several days of valid data. 
After all, it&amp;rsquo;s not an easy problem &amp;ndash; I spent months on it myself &amp;ndash; so I give it a little help.&lt;/p&gt;&#xA;&lt;p&gt;However, at this point Gemini throws an error and it is impossible to make any further progress.&lt;/p&gt;&#xA;&lt;p&gt;The best option would be to start from scratch, but in the meantime I have reached the &lt;code&gt;quota limit for Gemini 3 Pro (High)&lt;/code&gt; and would have to wait several days to continue.&lt;/p&gt;&#xA;&lt;p&gt;That is clearly not an option, so I decide to switch models and try &lt;code&gt;Gemini 3 Flash&lt;/code&gt;, which everyone is talking about right now. Gemini 3 Flash starts working and proposes a plan that seems sensible, but that in practice produces a &lt;em&gt;jumbled mess&lt;/em&gt; due to the presence of &lt;em&gt;extraneous characters and unclear formatting&lt;/em&gt;, as Gemini 3 Flash sadly admits.&lt;/p&gt;&#xA;&lt;p&gt;I therefore recommend using the &lt;code&gt;pdftools&lt;/code&gt; library in R. I have used it myself and it produces far better results than similar Python tools. Finally, bingo! Gemini 3 Flash produces a data file that is 100% correct.&lt;/p&gt;&#xA;&lt;p&gt;To be truly useful, I should have it repeat the process for all stations and for several different years, but for now I&amp;rsquo;m satisfied, given that I don&amp;rsquo;t want to waste all my quota on this task. In any case, the result achieved with &lt;code&gt;Gemini 3 Flash&lt;/code&gt; is truly promising and deserves further exploration.&lt;/p&gt;&#xA;&lt;p&gt;On the other hand, it&amp;rsquo;s impossible to achieve anything worthwhile with &lt;code&gt;Claude Sonnet 4.5 (Thinking)&lt;/code&gt;. After trying again and again, I reach the quota limit for this model and have to stop.&lt;/p&gt;&#xA;&lt;p&gt;What about grading this task? Gemini 3 Flash achieves the maximum score, while Gemini 3 Pro should be retested under the same conditions as the smaller model. 
Lastly, Claude Sonnet 4.5 is unclassified.&lt;/p&gt;&#xA;&lt;h5 id=&#34;task-7-programming-a-microcontroller&#34;&gt;Task #7: Programming a microcontroller&lt;/h5&gt;&#xA;&lt;p&gt;Another area in which Google Antigravity&amp;rsquo;s &lt;em&gt;agentic&lt;/em&gt; functions could prove very interesting is the programming of microcontrollers such as Arduino, Raspberry Pi Pico, or ESP32. There are literally mountains of these little gadgets, and it&amp;rsquo;s difficult to navigate the many different models as well as the thousands of sensors and modules that can be connected to them.&lt;/p&gt;&#xA;&lt;p&gt;In this specific case, an AI system capable of quickly producing working code would be extremely useful.&lt;/p&gt;&#xA;&lt;p&gt;For my tests, I decide to use a &lt;a href=&#34;https://www.arduino.cc/pro/hardware-product-nicla-vision/&#34;&gt;Nicla Vision&lt;/a&gt;, which is essentially an Arduino with a built-in camera. The Nicla Vision is a true gem. It&amp;rsquo;s not cheap, but it&amp;rsquo;s worth every penny. However, it is also far less common than standard microcontrollers. If Antigravity can program it correctly, it will certainly succeed with normal Arduinos, Pi Picos, or ESP32s.&lt;/p&gt;&#xA;&lt;p&gt;I switch back to using &lt;code&gt;Gemini 3 Flash&lt;/code&gt; and ask if it knows the Nicla Vision. I also provide it with a link to the product webpage to help it locate the device. Gemini confirms that it knows exactly what I am talking about, and we can begin.&lt;/p&gt;&#xA;&lt;p&gt;To program these devices, I have only ever used the &lt;a href=&#34;https://www.arduino.cc/en/software/&#34;&gt;Arduino IDE&lt;/a&gt; and &lt;a href=&#34;https://thonny.org/&#34;&gt;Thonny&lt;/a&gt;, and I don&amp;rsquo;t know if Gemini can manage the microcontroller directly from the editor. 
When I ask, Gemini replies with a very simple command&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;% ls /dev/tty.usbmodem* /dev/cu.usbmodem* 2&amp;gt;/dev/null || echo &amp;#34;No usbmodem devices found&amp;#34;&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;that allows checking if the Nicla Vision is connected to the Mac. Once it has found it, it checks whether the Python libraries required for communication with the microcontroller are installed. Since they are not, it proposes installing them. It also rightly gives me the option of installing them globally or in a &lt;em&gt;virtual environment&lt;/em&gt;, and needless to say, I choose the latter option.&lt;/p&gt;&#xA;&lt;p&gt;Gemini also prepares a short test script that makes the LED blink (the classic &amp;ldquo;Hello World&amp;rdquo; of the IoT world). When I confirm that everything works, it proudly proclaims:&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;answer&#34; open&gt;&#xA;    &lt;summary class=&#34;answer&#34;&gt;&#xA;        Answer&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;answer&#34;&gt;&#xA;        &#xA;I&#39;m ready to start writing more complex code for your project. What would you like to build first?&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;Up to now, Gemini has done nothing that I could not have done myself with &lt;a href=&#34;https://brew.sh/&#34;&gt;Homebrew&lt;/a&gt; and &lt;a href=&#34;https://github.com/conda/conda&#34;&gt;Conda&lt;/a&gt;. However, I would have had to consult pages and pages of scattered web documentation. Here, instead, I managed to configure everything in just a few minutes and I also learnt many useful things.&lt;/p&gt;&#xA;&lt;p&gt;Since Gemini was so keen to be useful, I asked it to write a small program that uses the Nicla Vision to take photos at set intervals and send them to the Mac via Wi-Fi. The result? 
Everything worked on the first try.&lt;/p&gt;&#xA;&lt;p&gt;And Gemini didn&amp;rsquo;t even get offended when I told it straight out that I didn&amp;rsquo;t trust it with my Wi-Fi password. In fact, it suggested a more secure alternative. But I am distrustful by nature, so in the end I used my iPhone&amp;rsquo;s hotspot with a disposable password.&lt;/p&gt;&#xA;&lt;p&gt;In conclusion, even the last task is passed with full marks.&lt;/p&gt;&#xA;&lt;h4 id=&#34;leaving-a-trace&#34;&gt;Leaving a trace&lt;/h4&gt;&#xA;&lt;p&gt;&lt;del&gt;One truly annoying thing about Antigravity is that it doesn&amp;rsquo;t have a function to export a complete &lt;code&gt;log&lt;/code&gt; of what you do, with the questions asked and the model&amp;rsquo;s answers, and maybe even its &lt;em&gt;thoughts&lt;/em&gt;.&lt;/del&gt;&lt;/p&gt;&#xA;&lt;p&gt;&lt;strong&gt;Edit.&lt;/strong&gt; I was wrong: the keyboard shortcut &lt;strong&gt;CMD&lt;/strong&gt; (⌘)-&lt;strong&gt;E&lt;/strong&gt; opens the Agent Manager, a separate Antigravity window that, among other things, lets you view the entire history of what Antigravity has done, including its &lt;em&gt;thoughts&lt;/em&gt;. The problem is that there isn’t a specific menu item for the Agent Manager (or at least, I haven’t found one), and the keyboard shortcut is only shown when the program starts.&lt;/p&gt;&#xA;&lt;p&gt;&lt;strong&gt;Edit #2.&lt;/strong&gt; The Agent Manager can also be opened directly from Antigravity’s toolbar: the corresponding entry is located right next to the icons that allow you to change the application layout and perform searches. I don’t know why I didn’t notice it earlier, but in my partial defence I should point out that all the other toolbar commands are represented by icons. 
For this reason it isn’t immediately clear that this entry is a command rather than just a plain text string, just like the name of the application sitting in the center of the toolbar.&lt;/p&gt;&#xA;&lt;p&gt;Anyway, after every operation, Antigravity lets me read a detailed &lt;code&gt;Walkthrough&lt;/code&gt; of what it did. Right-clicking on the relevant Tab,&lt;/p&gt;&#xA;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-01-21-sorpresa-antigravity/walkthrough.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;I see that it is possible to &lt;em&gt;reveal&lt;/em&gt; the file in the Finder. This opens a folder with a large number of Markdown files, all having standard names such as &lt;code&gt;implementation_plan&lt;/code&gt;, &lt;code&gt;task&lt;/code&gt;, or &lt;code&gt;walkthrough&lt;/code&gt; followed by the &lt;code&gt;resolved&lt;/code&gt; suffix and a progressive number&lt;/p&gt;&#xA;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2026-01-21-sorpresa-antigravity/agent-files.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;that provide the &lt;em&gt;history&lt;/em&gt; of everything I do with the agentic system.&lt;/p&gt;&#xA;&lt;p&gt;Whenever I request something from one of the integrated &lt;em&gt;agents&lt;/em&gt;, Antigravity generates three files containing: (1) what it must do, &lt;code&gt;task&lt;/code&gt;; (2) the work plan, &lt;code&gt;implementation_plan&lt;/code&gt;; and (3) a summary of what it did, &lt;code&gt;walkthrough&lt;/code&gt;. At first glance, it&amp;rsquo;s not easy to relate these files to each other, but the mechanism deserves a closer inspection. 
For the record, all the files in question are in a directory inside &lt;code&gt;~/.gemini/antigravity/brain/&lt;/code&gt;, but the main &lt;code&gt;~/.gemini/antigravity/&lt;/code&gt; directory contains many other directories with configuration files, generated code, and more.&lt;/p&gt;&#xA;&lt;h4 id=&#34;conclusions&#34;&gt;Conclusions&lt;/h4&gt;&#xA;&lt;p&gt;It&amp;rsquo;s clear that I enjoyed using Antigravity. In some cases, those who know how to program risk being &lt;a href=&#34;https://mastodon.social/@recantha/115829219381438249&#34;&gt;slowed down rather than helped&lt;/a&gt;, but in general, and in particular for those familiar with Word, Excel and little else, Antigravity&amp;rsquo;s &lt;em&gt;agentic&lt;/em&gt; functions can prove to be a truly valuable aid.&lt;/p&gt;&#xA;&lt;p&gt;In my life as a non-professional programmer, I went from writing code while consulting huge books of documentation to searching for answers on &lt;a href=&#34;https://stackoverflow.com&#34;&gt;StackOverflow&lt;/a&gt;. In my view, Antigravity&amp;rsquo;s agentic functions are like &lt;a href=&#34;https://stackoverflow.com/&#34;&gt;StackOverflow&lt;/a&gt; on steroids: not only do they provide answers to our questions, they also interpret and reason about them, proposing solutions almost autonomously.&lt;/p&gt;&#xA;&lt;p&gt;But, just as on Stack Overflow, you should always rely on your own knowledge and experience to distinguish useful, working answers from those that are irrelevant.&lt;/p&gt;&#xA;&lt;p&gt;However, please, let&amp;rsquo;s not talk about &lt;em&gt;vibe coding&lt;/em&gt;, i.e. the ability to ask the program to perform a given task and then go for a walk while Antigravity does everything alone! 
&lt;sup id=&#34;fnref:5&#34;&gt;&lt;a href=&#34;#fn:5&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;5&lt;/a&gt;&lt;/sup&gt; As the examples show, if you are not very precise in explaining what you want and you don&amp;rsquo;t check step-by-step what Antigravity proposes to do, you risk the entire process failing or producing wrong results.&lt;/p&gt;&#xA;&lt;p&gt;Just like a thesis or PhD student, or a young colleague, Antigravity can be a great help if you know what you want and how to achieve it, and can explain it clearly. However, &lt;a href=&#34;https://bigthink.com/starts-with-a-bang/vibe-physics-ai-slop/&#34;&gt;making it come up with its own ideas&lt;/a&gt; &amp;ndash; I don&amp;rsquo;t mean original ideas, just sensible ones &amp;ndash; is still just science fiction for now. Or marketing.&lt;/p&gt;&#xA;&lt;div class=&#34;footnotes&#34; role=&#34;doc-endnotes&#34;&gt;&#xA;&lt;hr&gt;&#xA;&lt;ol&gt;&#xA;&lt;li id=&#34;fn:1&#34;&gt;&#xA;&lt;p&gt;The mere fact that Google was using code produced by a direct competitor for one of its development tools seemed, shall we say, rather unusual to me.&amp;#160;&lt;a href=&#34;#fnref:1&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:2&#34;&gt;&#xA;&lt;p&gt;It is true that TextMate and BBEdit are not true IDEs (in particular TextMate), but the difference in size is still impressive.&amp;#160;&lt;a href=&#34;#fnref:2&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:3&#34;&gt;&#xA;&lt;p&gt;Several years ago, I developed a &lt;del&gt;C&lt;/del&gt;, er Python program that could do that automatically, but only with images from a particular type of digital oscilloscope.&amp;#160;&lt;a href=&#34;#fnref:3&#34; class=&#34;footnote-backref&#34; 
role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:4&#34;&gt;&#xA;&lt;p&gt;The imagination of those who compiled the tables deserves an award.&amp;#160;&lt;a href=&#34;#fnref:4&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:5&#34;&gt;&#xA;&lt;p&gt;Even a very busy programming guru like Linus Torvalds uses vibe coding, but only for his own projects. In any case, he only uses it &lt;a href=&#34;https://itsfoss.com/news/linus-torvalds-vibe-coding/&#34;&gt;for his little toys&lt;/a&gt;, certainly not for developing the Linux kernel.&amp;#160;&lt;a href=&#34;#fnref:5&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;/div&gt;&#xA;</description>
    </item>
    <item>
      <title>Photocopied!</title>
      <link>https://static.233.196.69.159.clients.your-server.de/en/2025/12/15/photocopied/</link>
      <pubDate>Mon, 15 Dec 2025 18:00:00 +0000</pubDate>
      <guid>https://static.233.196.69.159.clients.your-server.de/en/2025/12/15/photocopied/</guid>
      <description>&lt;div style=&#34;position: relative; padding-bottom: 56.25%; height: 0; overflow: hidden;&#34;&gt;&#xA;      &lt;iframe allow=&#34;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&#34; allowfullscreen=&#34;allowfullscreen&#34; loading=&#34;eager&#34; referrerpolicy=&#34;strict-origin-when-cross-origin&#34; src=&#34;https://www.youtube.com/embed/SVCBA-pBgt0?autoplay=0&amp;amp;controls=1&amp;amp;end=0&amp;amp;loop=0&amp;amp;mute=0&amp;amp;start=0&#34; style=&#34;position: absolute; top: 0; left: 0; width: 100%; height: 100%; border:0;&#34; title=&#34;YouTube video&#34;&gt;&lt;/iframe&gt;&#xA;    &lt;/div&gt;&#xA;&#xA;&lt;p&gt;The video above is the official presentation of &lt;a href=&#34;https://antigravity.google/&#34;&gt;Google Antigravity&lt;/a&gt;, an IDE (&lt;em&gt;Integrated Development Environment&lt;/em&gt;) that is not just a simple IDE but is &amp;ldquo;a new way of working for this next era of agentic intelligence&amp;rdquo;. I haven’t yet figured out what that truly means, but it surely sounds very smart and up‑to‑date.&lt;/p&gt;&#xA;&lt;p&gt;So let&amp;rsquo;s see what this &amp;ldquo;&lt;em&gt;new way of working&lt;/em&gt;&amp;rdquo; is, in which the agents &amp;ldquo;&lt;em&gt;help you experience liftoff&lt;/em&gt;&amp;rdquo;. An expression that back in the day used to mean something quite different.&lt;/p&gt;&#xA;&lt;p&gt;I download the latest version of Google Antigravity (1.11.17), which once installed takes up a whopping 700 MB. 
Once I get past the usual onboarding screens and log in with my Gmail account, here&amp;rsquo;s what comes up&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://static.233.196.69.159.clients.your-server.de/img/2025-12-15-fotocopiato/google-antigravity.png&#34;&gt;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2025-12-15-fotocopiato/google-antigravity.png&#34; alt=&#34;&#34;&gt;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;It&amp;rsquo;s basically &lt;a href=&#34;https://code.visualstudio.com/&#34;&gt;VS Code&lt;/a&gt; from Microsoft, plus the &lt;em&gt;agentic&lt;/em&gt; tool &lt;code&gt;Antigravity&lt;/code&gt;, whose exact purpose I still don’t know, nor, more importantly, whether it’s useful.&lt;/p&gt;&#xA;&lt;p&gt;After all, that&amp;rsquo;s essentially what they told me during the onboarding process, when I was presented with a screen offering to either set up the editor from scratch or import settings from VS Code. Or Windsurf.&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://static.233.196.69.159.clients.your-server.de/img/2025-12-15-fotocopiato/google-antigravity-setup.png&#34;&gt;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2025-12-15-fotocopiato/google-antigravity-setup.png&#34; alt=&#34;&#34;&gt;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;Windsurf?&lt;/p&gt;&#xA;&lt;p&gt;Yes, &lt;a href=&#34;https://windsurf.com/&#34;&gt;Windsurf&lt;/a&gt;, an editor derived from &lt;a href=&#34;https://cursor.com/&#34;&gt;Cursor&lt;/a&gt;,&lt;sup id=&#34;fnref:1&#34;&gt;&lt;a href=&#34;#fn:1&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;1&lt;/a&gt;&lt;/sup&gt; which in turn was derived from the always-present VS Code.&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://static.233.196.69.159.clients.your-server.de/img/2025-12-15-fotocopiato/windsurf.png&#34;&gt;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2025-12-15-fotocopiato/windsurf.png&#34; alt=&#34;&#34;&gt;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;The 
similarity (to use a euphemism) becomes obvious when you place them side by side, with Antigravity on the left and Windsurf on the right.&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://static.233.196.69.159.clients.your-server.de/img/2025-12-15-fotocopiato/side-by-side.png&#34;&gt;&lt;img src=&#34;https://static.233.196.69.159.clients.your-server.de/img/2025-12-15-fotocopiato/side-by-side.png&#34; alt=&#34;&#34;&gt;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;Identical.&lt;/p&gt;&#xA;&lt;p&gt;In short, the revolutionary IDE, the &amp;ldquo;&lt;em&gt;new way of working for this upcoming era of agentic intelligence&lt;/em&gt;&amp;rdquo; that &amp;ldquo;&lt;em&gt;helps you liftoff&lt;/em&gt;&amp;rdquo;, is nothing more than a clone of an editor that was already a clone of another editor that was itself a clone&amp;hellip;&lt;/p&gt;&#xA;&lt;div style=&#34;position: relative; padding-bottom: 56.25%; height: 0; overflow: hidden;&#34;&gt;&#xA;      &lt;iframe allow=&#34;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&#34; allowfullscreen=&#34;allowfullscreen&#34; loading=&#34;eager&#34; referrerpolicy=&#34;strict-origin-when-cross-origin&#34; src=&#34;https://www.youtube.com/embed/jBXko1pbowA?autoplay=0&amp;amp;controls=1&amp;amp;end=0&amp;amp;loop=0&amp;amp;mute=0&amp;amp;start=0&#34; style=&#34;position: absolute; top: 0; left: 0; width: 100%; height: 100%; border:0;&#34; title=&#34;YouTube video&#34;&gt;&lt;/iframe&gt;&#xA;    &lt;/div&gt;&#xA;&#xA;&lt;p&gt;&lt;a href=&#34;https://levelup.gitconnected.com/the-2-4b-clone-how-three-companies-carved-up-one-startup-in-72-hours-36039524a9bb&#34;&gt;Digging a little deeper&lt;/a&gt; it is easy to learn that Google spent $2.4 billion to acquire the Windsurf code and hire its two founders, Varun Mohan and Douglas Chen, along with a handful of the company’s developers.&lt;sup id=&#34;fnref:2&#34;&gt;&lt;a href=&#34;#fn:2&#34; class=&#34;footnote-ref&#34; 
role=&#34;doc-noteref&#34;&gt;2&lt;/a&gt;&lt;/sup&gt; However, fearing antitrust intervention, Google avoided buying the actual product or the brand, so that Windsurf continues to exist and be developed independently, trying to repeat the David versus Goliath story.&lt;/p&gt;&#xA;&lt;p&gt;It is more or less as if, in the 1980s, Microsoft had hired Steve Jobs and Steve Wozniak (maybe also Bill Atkinson, Andy Hertzfeld, and Susan Kare), taking the original Macintosh code with them, and then re‑released it a few months later as Windows 1.0. In the meantime, Apple would have continued to develop the same code for the original Mac.&lt;/p&gt;&#xA;&lt;p&gt;I get the feeling that the Google team used something else to lift off.&lt;/p&gt;&#xA;&lt;p&gt;But putting the (silly) jokes aside, are the &lt;em&gt;agents&lt;/em&gt; in Antigravity actually useful during development? I’m not sure myself, and that’s what we’ll find out live next time.&lt;/p&gt;&#xA;&lt;div class=&#34;footnotes&#34; role=&#34;doc-endnotes&#34;&gt;&#xA;&lt;hr&gt;&#xA;&lt;ol&gt;&#xA;&lt;li id=&#34;fn:1&#34;&gt;&#xA;&lt;p&gt;If you&amp;rsquo;re interested in the controversy that followed the latest price hikes at Cursor, you can read about it &lt;a href=&#34;https://medium.com/@chrisdunlop_37984/wtf-is-cursor-doing-with-its-pricing-i-have-to-swap-at-this-rate-63681001a08e&#34;&gt;here&lt;/a&gt;.&amp;#160;&lt;a href=&#34;#fnref:1&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:2&#34;&gt;&#xA;&lt;p&gt;The post is behind a paywall, but you can read it freely &lt;a href=&#34;https://freedium-mirror.cfd/https://levelup.gitconnected.com/the-2-4b-clone-how-three-companies-carved-up-one-startup-in-72-hours-36039524a9bb&#34;&gt;here&lt;/a&gt;.&amp;#160;&lt;a href=&#34;#fnref:2&#34; class=&#34;footnote-backref&#34; 
role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;/div&gt;&#xA;</description>
    </item>
  </channel>
</rss>
