Is it ethical to use drones for targeted killings?

Opening Statement

The opening statement sets the foundation of any debate, establishing not only the team’s position but also the moral and logical framework through which the issue will be judged. In the case of “Is it ethical to use drones for targeted killings?”, this moment is crucial—it determines whether we view drone strikes as a necessary evolution in defense strategy or a dangerous erosion of human rights and global norms. Below are the opening statements from both the affirmative and negative sides, each delivering a clear, coherent, and ethically grounded perspective.

Affirmative Opening Statement

“Ladies and gentlemen, esteemed judges, opponents—we stand here today not to glorify war, but to defend a tool that, when used responsibly, saves lives, protects national security, and upholds a higher standard of ethical warfare. We affirm that it is ethical to use drones for targeted killings—and not merely permissible, but often morally obligatory.

First, drone strikes minimize collateral damage. Unlike carpet bombing or ground invasions that risk entire villages, drone technology allows surgical precision. According to a 2021 report by the Council on Foreign Relations, U.S. drone strikes in Pakistan resulted in significantly fewer civilian casualties per operation than traditional airstrikes. By reducing unintended deaths, drones align more closely with the principle of distinction in just war theory—the moral imperative to differentiate between combatants and non-combatants.

Second, governments have a duty to protect their citizens from imminent threats. When intelligence confirms that a terrorist leader is planning an attack that could kill hundreds, waiting for legal proceedings or boots on the ground may cost innocent lives. In such cases, a drone strike is not an act of vengeance, but an act of prevention—an ethical form of self-defense in a world where enemies do not wear uniforms or declare war.

Third, drones represent the evolution of warfare ethics in the 21st century. Asymmetric threats like ISIS, Al-Qaeda, and transnational terror networks operate outside state borders and legal systems. To fight them, we must adapt—not regress. Drones allow us to act decisively while minimizing troop deployment, reducing the risk of prolonged occupation and its associated human costs.

We acknowledge concerns about misuse—but the existence of abuse does not negate the legitimacy of responsible use. Just as we don’t ban surgery because of malpractice, we shouldn’t reject drones because of flawed implementation. With proper oversight, transparency, and adherence to international law, drone strikes can be—and already are—a more humane way to wage necessary wars.”

Negative Opening Statement

“Thank you. We oppose the use of drones for targeted killings—not because we condone terrorism, but because we uphold the rule of law, human dignity, and the very foundations of ethical governance. To normalize extrajudicial executions carried out by remote-controlled machines is to cross a moral Rubicon from which there may be no return.

Our first argument is rooted in sovereignty and international law. Targeted drone strikes often occur in countries without formal declarations of war and without the consent of their governments. In Yemen, Somalia, and Pakistan, the U.S. has conducted hundreds of strikes—many in violation of Article 2(4) of the UN Charter, which prohibits the use of force against the territorial integrity of any state. When one nation unilaterally decides who lives or dies on another’s soil, it sets a precedent that erodes global order and invites retaliation.

Second, drone killings bypass due process. No trial. No defense. No appeal. A person is labeled a ‘threat’ based on secret intelligence, often with minimal oversight, and then erased from existence by a pilot thousands of miles away. This is not justice—it is assassination. Even if the target is guilty, the method undermines the principle that all human beings deserve a fair legal process. As former UN Special Rapporteur Agnès Callamard stated, ‘The practice of targeted killing violates the right to life, the right to due process, and the prohibition of arbitrary execution.’

Third, drones create moral hazard through psychological detachment. The operator presses a button in Nevada while watching a screen, detached from the gravity of taking a human life. This distance reduces killing to a video game-like experience, weakening moral inhibition. Studies from the Journal of Military Ethics show that drone operators, despite being physically safe, suffer high rates of PTSD—precisely because they witness the aftermath of their actions in vivid detail, yet lack the closure of battlefield engagement. If even the killers are traumatized, how can we claim this is a civilized form of warfare?

We are not arguing for inaction. We support robust intelligence, diplomacy, and lawful military responses. But when we replace courts with cruise missiles and judges with joystick operators, we sacrifice our humanity in the name of security. And once that line is crossed, who decides who is next?”

Rebuttal of Opening Statement

This phase transforms abstract principles into direct confrontation. Having laid out their visions of justice, security, and morality, both teams now engage in intellectual combat—challenging not only what was said, but how it holds up under scrutiny. The second debaters step forward not merely to defend, but to destabilize the opposition’s foundation while reinforcing their own.

Affirmative Second Debater Rebuttal

Let me begin by addressing the core illusion at the heart of the negative case: that there exists some pristine alternative to drone strikes—a method of neutralizing imminent threats that is perfectly legal, completely bloodless, and universally accepted. That world does not exist. And in its absence, we must make difficult choices guided by proportionality, necessity, and restraint.

First, they claim drone strikes violate national sovereignty. But let us be honest: when a terrorist network operates freely within a failed state or from a region where the host government lacks control—or worse, tacitly supports extremism—does sovereignty become a shield for mass murder? If Al-Shabaab plans attacks on Nairobi from southern Somalia, and the Somali government cannot or will not stop them, do we wait until hundreds die before acting? The UN Charter allows for self-defense under Article 51. Drone strikes, conducted with care and intelligence, fall squarely within that right. To insist on absolute territorial integrity in such cases is to prioritize form over human life.

Second, they invoke due process as an absolute bar to targeted killings. Yet they ignore the reality of non-state actors who operate outside legal systems, reject trials, and seek only destruction. Due process is not a one-size-fits-all mechanism. A terrorist plotting a chemical attack in a subway doesn’t get a courtroom—he gets intercepted. Law enforcement methods apply where feasible, but in transnational conflict zones, military response is not a failure of justice; it is its extension. We are not executing citizens without trial—we are preventing atrocities by individuals actively engaged in war against civilians.

And finally, the emotional appeal about “joystick warfare” and PTSD among operators—while poignant—is dangerously misleading. Yes, drone pilots experience moral injury. But so do soldiers who witness friends blown apart by IEDs, or medics treating children maimed by suicide bombers. The difference? With drones, fewer boots on the ground mean fewer soldiers exposed, fewer civilian casualties, and less long-term occupation. Is it really more humane to send troops into dangerous raids that risk dozens of lives when a single precise strike can eliminate the threat?

They speak of dignity—but whose dignity? The terrorist’s, or the victims he intends to slaughter? Ethics cannot be reduced to procedural purity if it costs innocent lives. We choose prevention over passivity, precision over pandemonium, and responsibility over recklessness.

Negative Second Debater Rebuttal

The affirmative paints a picture of surgical justice—clean, rational, and restrained. But behind that glossy image lies a system built on secrecy, error, and unchecked power. They praise precision, yet cannot explain why hundreds of civilians have died in so-called "targeted" strikes. They cite legality, yet operate in countries where no war has been declared. They talk of duty to protect, but offer no accountability when protection goes awry.

Let’s start with their claim of minimal collateral damage. Data from Airwars and the Bureau of Investigative Journalism show that between 2014 and 2019, U.S. drone operations in Iraq and Syria likely killed over 8,000 civilians—not the handful they admit. Why the gap? Because the Pentagon defines all military-age males in a strike zone as combatants unless proven otherwise. That isn’t precision—that’s presumption. That isn’t ethics—that’s erasure.

Then there’s their argument about self-defense. Fine—if the threat is imminent. But too often, “imminence” is stretched beyond recognition. The U.S. has killed suspects based on “signature strikes”—that is, patterns of behavior deemed suspicious, without knowing the individual’s identity. You don’t get executed because you look like a threat. Or at least, you shouldn’t—if we still believe in innocence until proven guilty.

And let’s address the elephant in the room: oversight. The affirmative says, “With proper regulation, drones are ethical.” But where is that regulation? The CIA runs covert programs with zero transparency. Targets are placed on kill lists based on intelligence so classified even Congress struggles to review it. There is no appeals process. No independent verification. Just a name, a location, and a green light.

They compare drones to surgery. But surgery requires consent, diagnosis, and a licensed practitioner. Here, the patient never sees the doctor, the diagnosis is hidden, and the surgeon answers to no board. If that’s medicine, it’s malpractice on a global scale.

Finally, they dismiss our concern about psychological detachment as mere sentimentality. But this isn’t about feelings—it’s about moral erosion. When killing becomes routine, remote, and sanitized, it lowers the threshold for violence. It makes war easier to start and harder to stop. History teaches us that every technological leap in warfare—from the longbow to the nuclear bomb—demands stronger ethical guardrails, not weaker ones.

We are told drones save lives. Perhaps. But at what cost to the soul of our civilization? Security without law is tyranny. Power without limits is hubris. And death without dignity—for victim and perpetrator alike—is not progress. It is regression dressed in high-tech camouflage.

Cross-Examination

In the crucible of debate, no moment tests rigor like cross-examination. Here, arguments are no longer monologues—they become dialogues under duress. The third debaters step forward not to repeat, but to interrogate. Their mission: to corner opponents into contradictions, extract damaging concessions, and crystallize their own framework as the only coherent path forward.

With alternating turns beginning from the affirmative side, the exchange unfolds with surgical precision. Each question is a scalpel; each answer, an involuntary confession. This is not conversation—it is controlled demolition.

Affirmative Cross-Examination

Affirmative Third Debater:
To the Negative First Debater: You argued that drone strikes violate national sovereignty and thus erode global order. But if a terrorist group operates freely within a state that lacks effective control—say, ISIS remnants in eastern Syria—does your definition of sovereignty require us to wait until they launch attacks across borders before acting?

Negative First Debater:
We do not advocate passivity. But action must be multilateral, transparent, and authorized through international mechanisms like UN Security Council resolutions. Unilateral strikes set dangerous precedents.

Affirmative Third Debater:
So you admit intervention may be necessary—but only with permission. Then let me ask the Negative Second Debater: When intelligence confirms a high-value target is preparing a radiological device in a non-cooperative zone, and seeking international consensus would take weeks, does your insistence on procedural legitimacy mean we allow mass casualties to preserve diplomatic formality?

Negative Second Debater:
That is a false dichotomy. There are lawful alternatives—special operations, intelligence sharing, interdiction. We reject the idea that legality must be sacrificed for speed.

Affirmative Third Debater:
Then finally, to the Negative Fourth Debater: You claim drones create moral hazard because operators kill remotely. But soldiers who drop bombs from 30,000 feet also avoid physical risk. Is altitude—or distance—the real issue? Or is your objection actually to effectiveness: that drones make it too easy to eliminate threats without costly wars?

Negative Fourth Debater:
It’s not about difficulty—it’s about normalization. The ease lowers the threshold for lethal force. A pilot flying over a battlefield sees the chaos of war. A drone operator watches a screen, selects a target, and goes home for dinner. That psychological disconnection changes the nature of killing.

Affirmative Cross-Examination Summary:
Thank you. What we’ve heard confirms our case. The opposition insists on perfect legality and absolute sovereignty—but offers no viable response when evil exploits legal gray zones. They condemn drone strikes for bypassing courts, yet provide no mechanism to try terrorists who reject all law. And while they decry “joystick warfare,” they cannot distinguish it meaningfully from other forms of remote engagement. Their ethics are rigid where reality is fluid. They demand ideal processes in a world of urgent peril—and when pressed, retreat into abstraction. If ethics cannot adapt to asymmetric threats, then ethics become complicity in catastrophe.

Negative Cross-Examination

Negative Third Debater:
To the Affirmative First Debater: You cited low civilian casualties in Pakistan as proof of drone precision. But multiple NGOs report far higher numbers than official claims. If the government defines all military-age males near a strike site as combatants unless proven otherwise, doesn’t that inflate success rates and erase civilian deaths?

Affirmative First Debater:
We rely on verified data from independent think tanks, not unverified NGO estimates. While mistakes happen, the trend shows improvement in targeting accuracy over time.

Negative Third Debater:
Then to the Affirmative Second Debater: You defended “signature strikes”—targeting individuals based on behavior patterns, not identity. So if someone is seen meeting with suspected militants, traveling at night, or carrying a radio, does your standard permit killing them even if they’re a farmer, a journalist, or a local official?

Affirmative Second Debater:
Signature strikes are used sparingly and only when behavioral patterns strongly indicate hostile intent. Intelligence fusion minimizes errors. We don’t act on isolated signals.

Negative Third Debater:
And to the Affirmative Fourth Debater: You say oversight exists. But CIA drone programs are classified. Targets are placed on kill lists approved internally, without judicial review. In 2013, Senator Dianne Feinstein admitted Congress was kept in the dark. So when there’s no transparency, no appeal, and no accountability—who ensures the system isn’t abused?

Affirmative Fourth Debater:
Oversight occurs through executive branch reviews, interagency coordination, and post-strike assessments. While full public disclosure isn’t possible in covert ops, internal checks prevent arbitrary decisions.

Negative Cross-Examination Summary:
Ladies and gentlemen, the cracks are showing. The affirmative team champions precision, yet defends a system that presumes guilt by proximity. They celebrate restraint, yet endorse signature strikes—killing people not for who they are, but for how they look. And when asked about accountability, they fall back on “trust us”—a secret process, hidden criteria, invisible lists. This isn’t justice. It’s execution by algorithm and assumption. They speak of ethics, but build their case on secrecy and selective data. If this is the future of warfare, then we have already lost the war for human dignity.

Free Debate

(Affirmative First Debater)
You say drones violate sovereignty—but when terrorists operate from ungoverned spaces, whose sovereignty are we honoring? The warlord hiding in the mountains of Waziristan? Or the millions who suffer under his reign of terror? If sovereignty protects violence, then it has failed its moral duty. We act not because we reject law, but because we uphold a higher one—the right of innocent people to live free from fear.

(Negative First Debater)
And who decides which fears justify assassination? You speak of moral duty, yet your policy lets a bureaucrat in Langley check a box and erase a life. That’s not moral duty—that’s managerial murder. When did national security become a blank check signed in invisible ink?

(Affirmative Second Debater)
At least it’s a check that weighs proportionality and intelligence. Your alternative? Wait for another 9/11-scale attack before acting? Let’s be clear: we’re not debating whether killing is tragic—we all agree it is. We’re asking: when faced with imminent threat, is it more ethical to act precisely or do nothing? Silence is not neutrality—it’s complicity.

(Negative Second Debater)
Complicity in what? Preventing attacks? Or normalizing secret courts run by algorithms and anonymous analysts? You claim precision, yet over 200 children died in a single strike in Kunduz—was that “surgical”? Or just sloppy arithmetic dressed up as strategy?

(Affirmative Third Debater)
One tragedy does not invalidate a tool any more than one plane crash grounds all aviation. Yes, mistakes happen—and each one must be investigated, mourned, and learned from. But to reject drones entirely is to demand perfection from imperfect humans in impossible situations. Should we also ban ambulances because sometimes they hit pedestrians?

(Negative Third Debater)
Ah, now we’re comparing drones to ambulances? How charming. But last time I checked, ambulances don’t use license plates to determine who lives or dies. Yet your “signature strikes” target men simply because they move like militants, talk to the wrong people, or visit the wrong village. That’s not medicine—it’s profiling with missiles.

(Affirmative Fourth Debater)
So your solution is paralysis? Because intelligence isn’t perfect, we do nothing—even when satellites show bomb-making factories and intercepted calls confirm suicide missions? Let me ask you: if your child were on a school bus heading into a known ambush zone, and you could stop one car with a single shot, would you hesitate just because the shooter might be misidentified? Ethics isn’t found in inaction—it’s found in responsible intervention.

(Negative Fourth Debater)
That’s a false dilemma—you’ve turned a complex geopolitical reality into a Hallmark card. Real ethics means building systems that prevent such choices, not weaponizing emotional blackmail to justify remote-control revenge. And let’s not forget: most drone strikes aren’t stopping buses—they’re killing suspects during routine activities. A man eats dinner, prays, walks outside—boom. Is dinner now a capital offense?

(Affirmative First Debater)
You keep calling it “assassination,” but legally and morally, there’s a difference between murdering a diplomat and eliminating an enemy commander actively planning mass murder. Under the Geneva Conventions, combatants don’t get immunity just because they hide behind Wi-Fi signals. War has moved online—should our ethics stay stuck in 1945?

(Negative First Debater)
Then update the rules! Don’t bypass them. There’s a reason we have laws of war—to prevent exactly this kind of unilateral reinterpretation. When every nation claims the right to kill anyone, anywhere, based on secret evidence, we don’t have a new world order—we have global vigilantism. Next thing you know, China will drone-strike a Uighur activist in Turkey and call it “counter-terrorism.”

(Affirmative Second Debater)
And that’s why oversight matters! We’re not advocating cowboy operations. We support congressional review, international norms, and transparent reporting. But let’s not throw out the baby with the bathwater—or in your case, drown the entire neighborhood because you dislike the plumbing.

(Negative Second Debater)
Transparency? Really? How transparent was the strike that killed Iranian General Qasem Soleimani—on foreign soil, without warning, escalating tensions overnight? Was Congress informed? Were allies consulted? Or was it just another “oops-we-did-it-again” moment wrapped in patriotic packaging?

(Affirmative Third Debater)
General Soleimani wasn’t some innocent bystander—he directed attacks that killed hundreds of American troops. To treat his elimination as scandalous is to rewrite history. If you’d applied that standard in World War II, we’d still be debating whether Hitler qualified for due process.

(Negative Third Debater)
Oh, so now we’re fighting Nazis with Netflix-style recommendation engines? “Because he’s watched extremist content, he might commit evil—target acquired.” This isn’t justice—it’s judgment by algorithm. And when machines help decide death, humanity gets outsourced.

(Affirmative Fourth Debater)
Human beings make the final call—not algorithms. Intelligence informs, humans decide. Unlike ground invasions that unleash chaos, drones allow restraint. They reflect a more humane ethic—one that minimizes risk to soldiers and civilians. Isn’t that progress?

(Negative Fourth Debater)
Progress? Or convenience? It’s easy to press a button when you can’t hear the screams. Easy to call it “clean” when you never see the crater filled with children. You celebrate reduced troop exposure—but at what cost to our moral exposure? When killing becomes risk-free for us, do we lose the instinct to avoid war altogether?

(Affirmative First Debater)
So courage now requires getting killed? Should we send troops into caves just to prove we care? Ethics isn’t measured by how many of our own die—it’s measured by how many innocents we protect. Drones do that better than any tool we’ve ever had.

(Negative First Debater)
Better? Maybe. Ethical? Not unless it’s accountable. Right now, there’s no independent court to challenge a name on a kill list. No habeas corpus. No day in court. Just silence, surveillance, and sudden fire from the sky. That’s not ethics—that’s execution by bureaucracy.

(Affirmative Second Debater)
And your ideal world? Where we wait for attacks to happen because we lack the courage to stop them preemptively? Where we tie our own hands while enemies cut theirs loose? That’s not principle—that’s posturing with other people’s lives.

(Negative Second Debater)
No—our principle is simple: no one should hold the power of life and death without answerability. Not kings, not generals, not drone operators watching feeds with popcorn. Power demands limits. Otherwise, we don’t defend civilization—we erode it slowly, one clean strike at a time.

(Affirmative Third Debater)
Then let’s build those limits—not abolish the tool. Regulate, don’t prohibit. Improve oversight, don’t retreat into paralysis. Because the real injustice isn’t using drones wisely—it’s failing to protect the vulnerable out of ideological purity.

(Negative Third Debater)
And the real danger isn’t regulation—it’s normalization. Every time we accept another strike, another secret list, another unaccountable decision, we lower the bar. Soon, “targeted killing” won’t be the exception—it’ll be the expectation. And once that happens, good luck getting the genie back in the bottle.

(Affirmative Fourth Debater)
Or perhaps, just perhaps, we evolve. From trench warfare to cyber defense, war changes. So must our ethics. To freeze morality in time is to invite defeat—not just militarily, but morally. Because the greatest failure isn’t using force; it’s allowing evil to flourish while we debate the perfect way to stop it.

(Negative Fourth Debater)
But if the means corrupt the end, then victory tastes like ash. We can win every battle and still lose our soul. Drones may reduce bloodshed abroad—but they risk desensitizing us at home. And when conscience goes numb, tyranny doesn’t march in with drums. It clicks in with a mouse.

Closing Statement

In the final moments of a debate, the noise fades, the exchanges settle, and what remains is not merely who argued better—but who mattered more. The closing statement is not a recap; it is a reckoning. It asks: What values are at stake? What future are we choosing? And at what cost does security come?

Both sides now step forward one last time—not to fight, but to frame. To rise above tactics and speak to principles. To answer not only the motion, but the deeper question beneath it: In the shadow of terror and technology, how do we remain human?

Affirmative Closing Statement

We began this debate with a simple premise: ethics demand action when lives are at risk. And throughout this exchange, we have shown that drones, when used with discipline and oversight, are not a departure from morality—but its most advanced expression in modern warfare.

Let us be clear: we do not celebrate killing. We mourn every loss. But we also recognize that in a world where terrorists hide among civilians, strike without warning, and reject all rules, waiting for perfect legality or ideal conditions is not prudence—it is paralysis. And paralysis costs lives.

Our opponents cling to a romanticized vision of justice—one where every threat gets a trial, every border is respected, and violence is only legitimate when declared by treaty. But reality is messier. When a child suicide bomber is being fitted with a vest in a compound in Idlib, and intelligence confirms the attack window is hours away—do we send in 50 soldiers to risk their lives in a raid? Or do we eliminate the threat with a single strike, sparing countless innocents?

Drones allow us to choose the latter. They offer precision unmatched in human history. Yes, mistakes happen—but far fewer than in conventional warfare. Yes, oversight must improve—but that calls for reform, not rejection. To abandon drones because of misuse is like banning fire because someone burned down a house.

They say sovereignty is violated. But whose sovereignty protects children in schools targeted by extremists? Whose law defends villages held hostage by militias? When states fail or collude, the moral duty shifts—to protect the vulnerable, wherever they are.

They say due process is denied. But due process is not a right for those waging war on civilization. Enemy combatants in active conflict zones are not entitled to courtroom trials—they are subject to lawful military action. That has been true since Geneva, since Nuremberg, since the dawn of armed conflict.

And yes, drone operators suffer. But so do all who bear the weight of war. The difference is, with drones, fewer suffer overall. Fewer soldiers die. Fewer civilians bleed. Fewer wars drag on for decades.

This is not about convenience. It is about proportionality. It is about minimizing harm. It is about doing the least damage necessary to stop the greatest evil.

So let us not confuse ethics with inaction. Let us not elevate procedure above protection. The ethical choice is not always the easiest—but sometimes, it is the one that presses the button to save a thousand lives.

We affirm: using drones for targeted killings is not only ethical—it is often the most ethical option available. Not because we love machines, but because we value human life—especially when it hangs by a thread.

Negative Closing Statement

Thank you.

If the affirmative sees a scalpel, we see a shadow—a dark silhouette creeping across the globe, operated by unseen hands, accountable to no court, answering to no public. Drones are not the evolution of ethics. They are the automation of assassination.

We do not dispute that some targets pose real threats. Nor do we suggest that governments should stand idle. But we insist—loudly and clearly—that how we respond defines who we are. You can defeat terrorism and still lose your soul.

The affirmative team has spent this debate normalizing the extraordinary: unilateral strikes in sovereign nations, secret kill lists, identity-based targeting, and the erasure of due process—all justified under the banner of “necessity.” But necessity is the oldest excuse for tyranny. And once we accept that a government can execute anyone, anywhere, based on secret evidence and hidden criteria, we have crossed a line from which there is no return.

They cite precision. Yet their own data hides behind classification. Independent investigations reveal hundreds, even thousands, of civilian deaths—men, women, children—labeled “combatants” by default. Is that precision? Or is it prejudice dressed as policy?

They claim self-defense. But self-defense requires imminence—and “signature strikes” target people not for what they’ve done, but for what they might do. That is not justice. That is profiling with missiles.

And they dismiss our warnings about moral hazard as outdated sentiment. But history remembers such dismissal. Every weapon that made killing easier—gunpowder, machine guns, nuclear arms—was first hailed as “cleaner,” “faster,” “more humane.” And every time, it lowered the threshold for war.

Drones do more than kill. They desensitize. They distance. They make murder routine. A pilot in Nevada watches a screen, hears a pop, and moves on to lunch. There is no gravity. No consequence. No ritual of mourning. Just a checkbox on a spreadsheet.

And when the public never sees the blood, never hears the screams, never questions the list—we lose our ability to resist abuse. Because oversight cannot exist where transparency is absent.

This debate is not just about drones. It is about power. About whether we build systems that constrain it—or ones that conceal it.

We are told, “If you’re not a terrorist, you have nothing to fear.” But that is the logic of dictatorships. Innocent people have been killed by these programs. Families erased. Communities shattered. And no one is held accountable.

So ask yourselves: Do we want a world where any leader, in any country, can declare someone an enemy and erase them without trial? Because if we accept it today from one nation, we cannot condemn it tomorrow from another.

Security without law is not safety. It is subjugation. And a peace built on secrecy and fear is no peace at all.

We urge you: reject the normalization of remote execution. Demand transparency. Uphold due process. Protect the rule of law—even when it is hard.

Because in the end, the measure of a civilization is not how efficiently it kills its enemies, but how fiercely it defends its principles—even in darkness.

We stand opposed—not against defense, but against dehumanization. Not against innovation, but against impunity.

And for that, we say: no, it is not ethical to use drones for targeted killings. It never was. And if we allow it to continue, we may soon forget why it mattered.