2026 UMD Agentic AI Challenge

Registration Opens
March 17, 2026

Submissions Due
April 10, 2026

Final Demo
April 24, 2026

The 2026 UMD Agentic AI Challenge invites student teams to design, build and test autonomous AI agents that solve real business, analytical and operational problems. Over four weeks, teams will prototype and deploy intelligent systems focused on measurable outcomes, cost-efficiency, reliability and safety.

This competition connects cutting-edge AI research with practical engineering and responsible innovation, preparing students for the next frontier of agentic AI.

Register Your Team

Problems and Guidelines

What teams will do

Teams will create autonomous agentic AI systems that are not only innovative, but also practical, robust, and aligned with real organizational constraints.

  1. Design an autonomous AI solution to address a meaningful business or operational challenge.
  2. Prototype and deploy the agentic AI system under real-world constraints.
  3. Demonstrate performance through live evaluation.
  4. Present results and explain system performance, cost considerations, and safety measures.

Instructions for Participation

How to Participate

The challenge is open to all UMD student teams ready to build agentic systems that can perform under realistic business and operational conditions.

Step 1

Form a team of 4 or 5 UMD students and register your team. Registration opens on March 17, 2026.

Step 2

Choose one of the provided challenge problems or propose your own. Any self-defined problem must be grounded in a real-world business context.

Step 3

Use a real or synthetic dataset, including data generated with an LLM if needed, to design and test your solution.

Step 4

Prepare a write-up of your minimum viable solution, no more than 5 pages, and submit it using the submission link. Review the evaluation guidelines below before submitting.

Evaluation Guidelines

What Judges Will Look For

Industry partners and faculty members will evaluate submissions with a strong emphasis on solutions that work efficiently in practice. Cost efficiency, reliability, and safety will play a significant role throughout the review process.

Organizers: Professors Manmohan Aseri and Jessica Clark

For any questions, please reach out to Kunal Roy Chowdhury.

Criterion 1

A realistic estimate of the project's timeline and cost.

Criterion 2

Clear, specific ROI or outcome for the firm.

Criterion 3

Unintended consequences or ripple effects considered.

Criterion 4

Upstream and downstream dependencies mapped in a real firm.

Timeline

Week 1 - April 3

Proposal Submission

Week 2 - April 10

Screening Results (10 teams will be selected)

Week 3 - April 17

Selected Teams Submit a Working Product

Week 4 - April 24

Final Demo and Judging (Tyser Auditorium, Room 1212)
