Autonomous drones in urban warfare and civilian protection standards
If your company develops, operates, or exports autonomous drone systems—or is considering doing so—you need to understand the complex legal landscape governing their use. This article explains the international standards for civilian protection, regulatory requirements under EU law, and the substantial legal risks your business faces if these systems are deployed without proper compliance frameworks.

Quick summary
- Autonomous drones fall under EU export controls, the AI Act's high-risk classification, data protection laws and international humanitarian law.
- Both evolving EU standards and international humanitarian law require demonstrable human oversight in targeting and engagement decisions.
- Companies and operators must implement technical safeguards, conduct impact assessments, and maintain detailed records showing compliance.
Understanding the legal framework for autonomous drones
Autonomous drones represent one of the most heavily regulated categories of technology emerging today. If your organization is considering developing, deploying, or exporting these systems, you are operating at the intersection of multiple legal regimes: international humanitarian law (IHL), the EU Artificial Intelligence Act (Regulation (EU) 2024/1689), data protection law (the GDPR), export control rules, and national aviation safety standards all apply.
Each of these frameworks operates independently, and violations in any one can expose your company to substantial penalties, reputational damage, and operational restrictions.
The term "autonomous drone" itself requires careful definition in legal contexts. An autonomous system in this context refers to an unmanned aerial vehicle (UAV) that can identify targets, make targeting decisions, and apply force with minimal or no human intervention during the critical moment of engagement.
This is fundamentally different from remotely piloted drones, where a human operator maintains continuous control. The distinction matters enormously for legal compliance, because the level of autonomy directly determines which regulations apply and what obligations your company must satisfy.
ARROWS Law Firm regularly advises companies on the intersection of autonomous systems regulation and international obligations.
The international humanitarian law foundation
Before discussing specific regulatory schemes, you must understand the foundational legal principles that govern all armed conflict. These principles form the baseline against which all other regulations are measured, and they apply regardless of the technological sophistication of your systems.
The principle of distinction requires that all parties to an armed conflict distinguish at all times between combatants and civilians.
For autonomous systems, this principle creates an immediate technical problem. If your system cannot reliably distinguish between military and civilian targets with sufficient accuracy, its use is prohibited under Additional Protocol I to the Geneva Conventions.
The principle of proportionality provides that even when striking a legitimate military objective, the anticipated incidental harm to civilians and civilian property must not be excessive in relation to the concrete and direct military advantage anticipated.
For autonomous systems, this means your technology must be capable of assessing proportionality in real time, a requirement that existing AI systems struggle to meet reliably without human input.
The principle of precautions requires all feasible measures to minimize civilian harm.
Violations constitute war crimes and can trigger individual criminal liability for commanders, operators, and potentially system developers who knowingly supply weapons incapable of lawful use.
Meaningful human control
One of the most important requirements in autonomous weapons regulation is the concept of "meaningful human control" (MHC). This standard has become central to international discussions within the Convention on Certain Conventional Weapons.
MHC requires that a human operator possess sufficient information about the weapon system's capabilities, the operational context, and the target to exercise genuine judgment about whether to engage.
Operators must retain the ability to prevent, interrupt, or override the system's decisions—not merely retrospectively assess them after engagement occurs.
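In engineering terms, the "prevent, interrupt, or override" requirement is often implemented as an authorization gate: the system may propose an engagement, but weapon release is blocked unless an informed human explicitly approves it. The following is a minimal illustrative sketch of that architecture; every class, field, and threshold here is a hypothetical assumption for illustration, not a prescribed or certified design.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Decision(Enum):
    """The operator's explicit choice; absence of input is never approval."""
    APPROVE = auto()
    REJECT = auto()
    ABORT = auto()


@dataclass(frozen=True)
class EngagementRequest:
    """The information an operator needs to exercise genuine judgment."""
    target_id: str
    target_classification: str      # e.g. "military_vehicle" (hypothetical label)
    classifier_confidence: float    # 0.0 - 1.0
    collateral_risk_estimate: str   # output of a proportionality assessment


class HumanAuthorizationGate:
    """Hypothetical gate: no engagement proceeds without explicit human
    approval, and a hard technical floor blocks low-confidence requests
    regardless of what the operator enters."""

    def __init__(self, confidence_floor: float = 0.95):
        self.confidence_floor = confidence_floor

    def authorize(self, request: EngagementRequest,
                  operator_decision: Decision) -> bool:
        # Technical safeguard: below the confidence floor the request never
        # reaches an engageable state, whatever the human input was.
        if request.classifier_confidence < self.confidence_floor:
            return False
        # The critical decision to apply force rests with the human.
        return operator_decision is Decision.APPROVE
```

The design choice to make approval an affirmative act (rather than "engage unless overridden") mirrors the legal position above: the operator prevents or permits engagement before it occurs, not merely after the fact.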
National defense directives, such as the United States' DoD Directive 3000.09, require that any autonomous weapon system operating outside specific approved parameters receive explicit approval from senior defense officials.
microFAQ
1. Does "meaningful human control" mean a human must physically fire each weapon?
Not necessarily for every sub-function, but the critical decision to apply lethal force must be authorized by a human to ensure IHL compliance. The key point is that decision-making authority over life and death rests with the human.
2. Can a human monitor multiple autonomous systems simultaneously and still maintain meaningful control?
This depends on system complexity, response speed, and operational context. Regulators scrutinize claims that one operator can meaningfully control multiple high-speed autonomous systems in dynamic urban environments. Your company must document that the human cognitive load allows for effective intervention.
3. What happens if autonomous system AI improves faster than human operators can retrain?
This creates serious compliance risk. You must implement governance measures ensuring human training and system development proceed in parallel.
The EU AI Act's high-risk classification
The European Union's Artificial Intelligence Act (Regulation (EU) 2024/1689) entered into force in 2024 and applies in stages, with most obligations taking effect from August 2026. While the regulation explicitly exempts AI systems used exclusively for military purposes, this exemption is narrower than many companies assume.
The military exemption applies only when the AI system is placed on the market or used exclusively for military, defence, or national security purposes; a dual-use system also offered for civilian applications falls within the Act's scope.
When an autonomous system is subject to the AI Act, it typically falls under the "high-risk" classification. This is particularly relevant if it involves safety components of vehicles or biometric identification.
High-risk AI system requirements include establishing a risk management system, ensuring data governance, maintaining technical documentation and automatic event logging, enabling effective human oversight, and meeting accuracy, robustness, and cybersecurity standards.
Failure to comply with the AI Act carries massive penalties. Fines can reach up to €35 million or 7% of total worldwide annual turnover for using prohibited AI practices.
Your company must document how the AI system will function, what datasets were used to train it, how it was tested, and how it will maintain performance in the field.
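One pragmatic way to make such record-keeping demonstrable to a regulator is tamper-evident logging, where each compliance record is cryptographically chained to the previous one so that after-the-fact alteration is detectable. A minimal sketch follows; the class, field names, and event schema are illustrative assumptions, not a format prescribed by the AI Act.

```python
import hashlib
import json
from datetime import datetime, timezone


class DecisionAuditLog:
    """Hypothetical append-only compliance log. Each entry embeds the hash
    of its predecessor, so editing any past entry breaks the chain."""

    GENESIS = "0" * 64

    def __init__(self):
        self.entries = []            # list of (entry_hash, entry_dict)
        self._last_hash = self.GENESIS

    def record(self, event: dict) -> str:
        """Append an event (e.g. model version, operator, decision) and
        return its hash."""
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "event": event,
            "prev_hash": self._last_hash,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry_hash = hashlib.sha256(payload).hexdigest()
        self.entries.append((entry_hash, entry))
        self._last_hash = entry_hash
        return entry_hash

    def verify(self) -> bool:
        """Recompute the chain; returns False if any entry was altered."""
        prev = self.GENESIS
        for entry_hash, entry in self.entries:
            if entry["prev_hash"] != prev:
                return False
            payload = json.dumps(entry, sort_keys=True).encode()
            if hashlib.sha256(payload).hexdigest() != entry_hash:
                return False
            prev = entry_hash
        return True
```

In practice such a log would record dataset versions, test results, and field-performance checks alongside operational decisions, so the documentation trail the Act expects can be produced and its integrity verified on demand.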
ARROWS Law Firm can provide comprehensive advice on AI Act compliance, including risk assessments, technical documentation preparation, and representation.
Export controls for autonomous drone technology
If your company manufactures or exports autonomous drone technology, you are subject to European Union export control regulations governing dual-use items under Regulation (EU) 2021/821. Dual-use items are goods, software, and technology that can be used for both civilian and military applications.
Before exporting these items to countries outside the EU, your company must obtain an export license from the competent national authority.
The authorization procedure requires meticulous documentation. You must complete detailed export applications identifying the buyer, end-user, intended use, technical specifications, and quantity.
Even items not listed can trigger licensing requirements if the exporter is aware they are intended for military end-use in a country subject to an arms embargo.
Violations of the Czech Act on Control of Exports of Dual-Use Goods and Technologies carry severe penalties. The fine can reach CZK 20,000,000 or five times the value of the goods, whichever is higher.
microFAQ
1. Does my company need an export license if we are only exporting to other EU members?
Generally no for most dual-use items, but the most sensitive items listed in Annex IV of the Regulation require authorization even for intra-EU transfers. Record-keeping is still mandatory.
2. Can we export as long as we declare the civilian end-use?
Not automatically. Authorities assess the risk of diversion to military use. If the end-user is a military entity or located in a high-risk jurisdiction, the license may be denied regardless of the stated purpose.
3. What if we don't know if our component qualifies as dual-use?
You have an obligation to determine this. You can apply for a preliminary classification or information from the Ministry of Industry and Trade. Ignorance is not a defense against administrative or criminal liability.
Data protection and privacy
Autonomous drones equipped with cameras, sensors, or data collection capabilities fall under the General Data Protection Regulation (GDPR) whenever they collect information capable of identifying individuals. This applies to commercial and dual-use operations.
A critical ruling of the Court of Justice of the European Union (Ryneš, C-212/13) confirmed that video surveillance by a camera recording a public space constitutes processing of personal data.
Your company cannot operate autonomous drones for testing, surveillance, or commercial demonstration without implementing GDPR compliance measures if identifiable data is captured.
This includes establishing a lawful basis for processing and conducting a Data Protection Impact Assessment (DPIA).
ARROWS Law Firm has extensive experience helping companies navigate GDPR compliance for drone operations.
Czech aviation regulations
The Czech Republic implements the uniform European regulatory framework for unmanned aircraft systems based on EASA regulations. The framework divides drone operations into three categories based on risk level.
The Open Category covers low-risk flights where drones must bear a class identification label.
The Specific Category covers moderate-risk flights, including Beyond Visual Line of Sight (BVLOS). This requires an operational authorization from the Czech Civil Aviation Authority based on a risk assessment.
The Certified Category covers high-risk operations like transport of people or dangerous goods.
For autonomous drones in urban environments, especially those carrying payloads, the Certified Category or high-risk Specific Category authorization is mandatory.
The accountability gap
One of the most pressing legal challenges is establishing accountability. The International Criminal Court prosecutes individuals, not corporations.
Commanders and operators can be held liable for war crimes if they deploy autonomous weapons in violation of IHL.
Under the Czech Act on Criminal Liability of Legal Persons, a company can be criminally prosecuted for specific offenses committed in its interest. Penalties include dissolution of the company, forfeiture of property, and bans on activity.
If an autonomous system malfunctions and causes damage, the manufacturer or operator faces civil liability for damages.
ARROWS Law Firm advisors work with defense and security companies on governance frameworks to ensure clear lines of responsibility.
| Risks and Sanctions | How ARROWS helps (office@arws.cz) |
| --- | --- |
| Export control violations: Exporting dual-use drone technology without proper licensing results in fines up to CZK 20 million or 5x the value of goods, plus forfeiture and potential criminal prosecution. | Pre-export licensing compliance: ARROWS Law Firm conducts technical classification of your systems, prepares detailed export applications, manages Ministry coordination, and ensures compliance with EU and Czech export control frameworks. |
| GDPR violations: Operating drones that capture identifiable individuals without proper legal basis triggers fines up to €20 million or 4% of turnover, plus potential civil claims. | Data Protection Impact Assessments: ARROWS Law Firm conducts comprehensive DPIAs, implements GDPR-compliant data handling procedures, and advises on lawful bases for processing in security contexts. |
| AI Act non-compliance: Using prohibited AI practices or failing to meet high-risk obligations results in fines up to €35 million or 7% of turnover (prohibited practices) or €15 million or 3% (obligations). | AI Act compliance architecture: ARROWS Law Firm develops risk management frameworks, creates technical documentation, and prepares your organization for regulatory conformity assessments. |
| War crimes liability: Using systems that violate IHL principles exposes commanders and potentially developers to individual criminal liability under international law. | IHL governance and Article 36 reviews: ARROWS Law Firm structures legal review procedures for new weapons, conducting analysis to ensure systems are capable of operating within the bounds of international humanitarian law. |
| Civil liability: Defects in autonomous systems causing damage can lead to massive compensation claims under the Product Liability Directive and national civil codes. | Liability defense and risk mitigation: ARROWS Law Firm advises on product liability risks, insurance coverage, and contractual limitations of liability where applicable. |
The international dimension
The regulation of autonomous drones is fundamentally international. While your company operates from the Czech Republic, EU law, international humanitarian law, and export controls span multiple jurisdictions.
ARROWS Law Firm, as a leading law firm based in Prague, has extensive experience advising Czech and international companies.
Additionally, if your systems are used in armed conflicts abroad, they are subject to international humanitarian law regardless of where they were manufactured. International bodies and NGOs actively monitor the deployment of such technologies.
Practical governance framework
Understanding these legal requirements is insufficient; your company must implement governance structures ensuring compliance in actual operations.
Risk management systems must be established before system development begins.
Technical documentation must be maintained throughout the system lifecycle. This documentation serves multiple purposes: it demonstrates AI Act compliance and supports Article 36 weapons review.
Training and qualification of operators and commanders must be documented.
ARROWS Law Firm can assist with implementing these governance frameworks, reviewing your procedures against legal requirements, and representing you in interactions with regulators.
microFAQ
1. When should legal review of autonomous system design begin?
During the requirements definition phase. "Compliance by design" is cheaper and safer than retrofitting a developed system.
2. Who is responsible for legal compliance?
Responsibility should be shared between technical leadership (CTO) and a designated compliance officer or legal counsel. For dual-use exports, a specific responsible representative must be appointed vis-à-vis the Ministry.
3. What if our testing reveals our system cannot reliably distinguish civilians?
You cannot lawfully deploy or export the system for autonomous targeting functions. You must restrict the system to non-autonomous modes or redesign the sensor/AI fusion.
Executive summary for management
The regulatory environment for autonomous drones in urban warfare contexts presents substantial compliance challenges and financial risk. Your organization faces overlapping obligations under EU export control regulations, the AI Act, GDPR, Czech aviation law, and international humanitarian law.
Companies that proactively implement governance frameworks and document meaningful human control substantially reduce legal exposure.
ARROWS Law Firm has advised numerous companies on these issues and maintains specialized expertise in autonomous systems regulation, export controls, and AI compliance.
Conclusion of the article
Autonomous drones operating in urban warfare contexts represent cutting-edge technology governed by equally sophisticated legal frameworks. The regulatory burden is genuine: export controls, AI Act requirements, GDPR obligations, and aviation regulations apply simultaneously.
The interaction between the EU AI Act's high-risk classification and the IHL requirement for meaningful human control creates overlapping requirements.
ARROWS Law Firm has helped commercial and security companies develop compliant autonomous systems by integrating legal oversight into development.
FAQ – Frequently asked legal questions
1. Can our company legally develop autonomous drone systems that operate without human authorization to engage targets?
Developing fully autonomous lethal weapon systems (LAWS) that select and engage targets without any human intervention faces extreme legal and ethical scrutiny. While no specific treaty explicitly bans them yet, they may be deemed unlawful under IHL if they cannot comply with distinction and proportionality. It is strongly advised to maintain a "human-in-the-loop" or "human-on-the-loop" architecture.
2. If our drone system is designed for civilian commercial use, do we need export licenses?
Yes. Autonomous drone technology is a classic "dual-use" item. You must obtain an export license from the Ministry of Industry and Trade before exporting outside the EU. Intra-EU transfers of certain highly sensitive surveillance drones may also require authorization or notification.
3. What penalties does our company face for operating drones that collect personal data without GDPR compliance?
Penalties can reach up to €20 million or 4% of global annual turnover. Additionally, the Czech Office for Personal Data Protection (UOOU) can order a ban on processing, effectively grounding your fleet.
4. Our defense ministry wants to deploy our autonomous drone system. What legal review must we conduct?
You must support the state's "Article 36 Review" (under AP I to Geneva Conventions). This involves providing technical data proving the weapon can be used discriminately and proportionally. You should also conduct internal product safety and liability assessments.
5. If our autonomous drone system causes civilian casualties, could company leadership face criminal liability?
Generally, IHL liability falls on the military commander. However, if company leadership knowingly supplied defective equipment or equipment designed to violate IHL, or violated export controls to supply a prohibited end-user, criminal liability under national laws is possible.
6. How do the EU AI Act requirements interact with international humanitarian law?
They are distinct. The AI Act (if applicable via dual-use) focuses on product safety, fundamental rights, and transparency within the EU market. IHL focuses on the conduct of hostilities. A system could theoretically be safe under the AI Act but unlawful under IHL. You must comply with both sets of rules where applicable.
Disclaimer: The information contained in this article is for general informational purposes only and serves as a basic guide to the issue. Although we strive for maximum accuracy in the content, legal regulations and their interpretation evolve over time. To verify the current wording of the regulations and their application to your specific situation, it is therefore necessary to contact ARROWS Law Firm directly (office@arws.cz). We accept no responsibility for any damage or complications arising from the independent use of the information in this article without our prior individual legal consultation and expert assessment. Each case requires a tailor-made solution, so please do not hesitate to contact us.
Read also
- Digital Inspections and AI in 2026: New EU Compliance Duties for Firms
- How ARROWS Uses AI to Enhance Legal Work While Ensuring Accountability
- Legal Challenges of Human Control in Autonomous Military Systems
- Hybrid Warfare: Linking Drones, Cyberattacks, and Disinformation
- Cybersecurity, Drone Hijacking, and Liability for Damage
- Legal Validity and Risks of Simple Electronic Signatures in Czech Law
- ARROWS lawyers helped connect the worlds of technology and business