At SEIUM, technology is never neutral: it is designed, tested, and transferred within an explicit ethical, legal, and human rights framework. This policy establishes what we do, how we do it, and what we do not do when working in sensitive areas—including academic defense, civil security, autonomy, and dual-use—to ensure that teaching and research are non-operational, socially responsible, and in compliance with International Humanitarian Law (IHL) and international human rights standards.
Ensure that all academic, research, and external engagement activities comply with international humanitarian law, human rights, applicable laws, and internal policies; minimize the risk of misuse; and protect individuals, groups, and the public interest.
Students, faculty, technical staff, administrative staff, adjuncts, fellows, guests, contractors, and partners (companies, institutions, and government agencies). This policy applies to content, data, software, equipment, research, publications, and technology transfer.
We do not teach or develop tactics, doctrine, or operational TTPs (tactics, techniques, and procedures); instead, we focus on engineering, security, certification, ethics, and validation.
We exclude the design, optimization, or training related to offensive capabilities, lethality, weapon guidance, the exploitation of vulnerabilities outside legal frameworks, or the deliberate circumvention of regulatory safeguards.
The principles of distinction, proportionality, and precaution under international humanitarian law; the UN Guiding Principles on Business and Human Rights; the OECD Due Diligence Guidelines; and applicable national and international regulations.
Training in offensive capabilities, lethality, weapon guidance, the exploitation of vulnerabilities outside legal frameworks, or the deliberate circumvention of regulatory safeguards.

• Preventive assessment of human, social, and environmental risks.
• Compliance with regulations plus explicit ethical justification ("not everything that is legal is legitimate").
• Meaningful human oversight in autonomous and decision-making systems.
• Documented decisions; audits conducted by internal committees and, where appropriate, external committees.
• Data minimization, privacy by design, and protection of vulnerable groups.
• Control of biases, explainability, and reasonable access to recourse rights.
• Use of the minimum necessary for the legitimate academic/scientific purpose.
• Clear responsibilities, whistleblowing channels, and corrective measures.
• Initial screening (checklist on sensitivity, human rights, and export).
• Draft Human Rights Impact Assessment (HRIA) if individuals or groups are affected.
• Ethics Dossier: purpose, benefits, risks, mitigation measures, data, stakeholders, applicable standards.
• Classification (green/yellow/red) by the CEUD, with corresponding requirements (e.g., synthetic data).
• Mitigation plan: anonymization/pseudonymization, clean rooms, need-to-know segregation, export controls, rate limiting, watermarking, model cards.
• DPIA/PIA if personal data is involved; de-biasing and validation plan.
• Progress gates with CEUD/HSE/OCE reviews.
• Logging and audit trails; kill switches to halt tests.
• Publication review: what is published, how, and with what safeguards.
• Licenses, terms of use, and disclaimers.
• Final report, ethical review, data retention/disposal plan, lessons learned.
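As a purely illustrative sketch (not part of the policy itself), two of the safeguards named in the mitigation plan above, pseudonymization and data minimization, might look like the following. The key, field names, and record are hypothetical assumptions for the example:

```python
import hashlib
import hmac

# Hypothetical illustration of two mitigation-plan safeguards:
# pseudonymization (keyed, non-reversible tokens for direct identifiers)
# and data minimization (retaining only the fields the research needs).

SECRET_KEY = b"key-held-by-the-data-steward"   # placeholder, not a real key
ALLOWED_FIELDS = {"age_band", "region", "outcome"}  # assumed minimum necessary

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed, non-reversible token."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

def minimize(record: dict) -> dict:
    """Keep only the fields required for the legitimate academic purpose."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

record = {"name": "Jane Doe", "age_band": "30-39", "region": "North", "outcome": "A"}
safe = {"subject_id": pseudonymize(record["name"]), **minimize(record)}
```

Using an HMAC rather than a plain hash means the pseudonyms cannot be reversed by brute-forcing identifiers without the steward's key; real projects would also address key custody and re-identification risk in the DPIA.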
Design/optimization of lethal effects or weapon guidance.
Exploitation of vulnerabilities (cyber/OT) outside of legally authorized programs and without proper coordination.
Principles, borderline cases, warning signs, red flags.
DPIA, anonymization, biases, explainability, ARCO/DSAR rights.
Classification, counterparty screening, licenses, cash flows.
Safety procedures, substances/equipment, incident response.
Best practices, secure-by-design, incident response.
All staff and students participating in sensitive projects must keep their training up to date; failure to do so will result in the suspension of access to resources and repositories.
Preference for transparency when it does not jeopardize people, security, or compliance.
Summary data, aggregated results, omission of critical parameters.
Purpose, limitations, intended use, out-of-scope, residual risks.
Avoid overpromising; put risks and safeguards into context.
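As a hypothetical sketch of the "aggregated results" safeguard (the threshold value and categories are illustrative assumptions, not policy requirements), results can be published as per-category counts with small cells suppressed so individuals cannot be singled out:

```python
from collections import Counter

# Illustrative publication safeguard: report aggregate counts only,
# suppressing any cell below a minimum size. MIN_CELL_SIZE is an
# assumed example value, not a figure from the policy.

MIN_CELL_SIZE = 5

def aggregate_for_publication(categories: list[str]) -> dict:
    """Return per-category counts, suppressing cells below the threshold."""
    counts = Counter(categories)
    return {cat: (n if n >= MIN_CELL_SIZE else "<5") for cat, n in counts.items()}

raw = ["A"] * 12 + ["B"] * 3 + ["C"] * 7
published = aggregate_for_publication(raw)
# Category B falls below the threshold and is suppressed.
```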
We believe that technical excellence without accountability is not true excellence. For this reason, all teaching and research is subject to a robust framework of ethics, international humanitarian law (IHL), and human rights, featuring clear red lines, independent committees, auditable processes, and a “stop the line” culture. Our commitment is to educate and develop safe, legitimate, and socially beneficial engineering—from the road to orbit—without crossing the boundaries that protect life, dignity, and the rule of law.