Drones: From Harvest to Crosshairs
- Matyas Koszegi

- 6 days ago
- 3 min read
Drones are brilliant. They spray crops with surgical precision. They monitor irrigation better than any farmer walking the field. They film the Winter Olympics from angles that would have required a helicopter and a Hollywood budget ten years ago. They inspect bridges, deliver medicine, map forests, and save mountaineers who thought selfies were more important than weather reports. And then they kill people.

The same class of autonomous systems that fertilize wheat now loiter over battlefields. The same computer vision that stabilizes a camera during a ski jump can identify a vehicle convoy from 600 miles away. The difference is not technological. It is contextual. And the context is war.
For decades, we feared nuclear weapons. They are terrifying, yes. But they are also expensive, centralized, and controlled by a handful of states. Autonomous drones are different. They are cheap, scalable, software-driven and replaceable. As seen in the war in Ukraine, electronic warfare quickly neutralized traditional precision weapons. GPS got jammed, radios got spoofed, missiles lost their way. The solution was not more hardware. It was AI onboard the drone itself.
Systems like Shield AI's V-BAT operate without GPS or radio guidance. They rely on onboard machine learning models, navigating by vision and identifying targets independently. They can make decisions on their own, and they are remarkably resilient: seven jammers along the route, or sixty miles into hostile territory? Still flying, still accurate.
Here is the uncomfortable part. These drones do not learn in a vacuum. They are trained on massive image datasets. Faces, objects, streets, weather conditions, vehicles, human behavior patterns, you name it.
Where does that data come from? From the platforms we use daily. Models such as Meta's Llama are trained on internet-scale data. Computer vision systems that identify tanks are cousins of the same systems that tag your vacation photos.
Google contributed AI to Project Maven for analyzing drone footage. Amazon and Google provide cloud infrastructure under Project Nimbus. Palantir Technologies aggregates data at a scale that would make any intelligence agency from the 1970s faint.
All of this is marketed as dual use. Which is a polite way of saying the same algorithm can recommend a restaurant or select a target. When you clicked “I agree” on that privacy policy, you did not just enable personalized ads. You helped refine pattern recognition systems. You improved biometric tagging. You contributed to models that can distinguish a farmer from a soldier based on posture and equipment. Data is the new ammunition.
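The "dual use" point can be made concrete with a toy sketch. Below, a single nearest-neighbour classifier is applied unchanged to two jobs; the feature vectors and labels are invented for illustration, not taken from any real system. Only the training data differs, which is exactly the claim: the algorithm is neutral, the context is not.

```python
import math

def nearest_label(features, labeled_examples):
    """Return the label of the training example closest in Euclidean distance."""
    return min(labeled_examples, key=lambda ex: math.dist(features, ex[0]))[1]

# Consumer use: tagging a photo (hypothetical 3-dimensional feature vectors).
photo_examples = [((0.9, 0.1, 0.2), "beach"), ((0.1, 0.8, 0.7), "city")]
print(nearest_label((0.85, 0.2, 0.1), photo_examples))  # prints "beach"

# Military use: identical code, different training data.
vehicle_examples = [((0.2, 0.9, 0.3), "truck"), ((0.8, 0.3, 0.9), "tank")]
print(nearest_label((0.7, 0.4, 0.8), vehicle_examples))  # prints "tank"
```

Real targeting systems are vastly more complex, of course, but the asymmetry holds: the code is generic, and the dataset decides what it recognizes.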
Military strategy today does not begin with tanks. It begins with data: collection, satellite tracking, social graph analysis, communication intercepts, sentiment analysis, population profiling. Before a single drone lifts off, the digital battlefield is mapped.
Consider the geopolitical tension around Greenland. Surveillance interest precedes territorial ambition. Intelligence agencies do not wait for conflict. They prepare datasets. Modern warfare is predictive analytics with explosives attached.
So Should We Ban Drones?
No. Agricultural drones reduce pesticide waste and increase yield efficiency in a world facing climate instability and food insecurity. Disaster-response drones find survivors faster than any search team on foot. Broadcast drones democratize production, making global events accessible with minimal infrastructure. These things matter. Technology is not the villain. Centralized, opaque, militarized data pipelines are. The problem is not that drones can fly autonomously. The problem is that their autonomy is trained on surveillance capitalism and deployed in secrecy.
We need a privacy-first countermove. If autonomous weapons are powered by data, then reducing reckless data extraction is not paranoia, but civic hygiene. Here are a few things that should be the very basics:
- end-to-end encryption
- minimal social media exposure
- open-source alternatives
- operational security discipline
These are not fringe behaviors. They are rational responses to a world where image recognition models do not forget. On a systemic level, export controls on lethal autonomous systems should be as strict as those on chemical weapons. Transparent audits of military AI procurement should be mandatory. Dual-use claims should face independent oversight.
Drones can help farmers feed cities. They can document history from the sky. They can save lives in avalanches. They do not have to become mass-produced execution devices optimized by your vacation photos. Technology reflects incentives. Incentives reflect policy. Policy reflects pressure.
We built machines that can see. We can still decide what they look at. And perhaps, if we are disciplined enough about our data and serious enough about oversight, the future of drones will hover more often above wheat fields than battlefields.
If you like my posts, consider buying me a coffee. It helps me keep writing. Cheers!



