Class Overview
Why is this important?
As students advance their team automation projects, it becomes critical to identify and understand the broader technology landscape required to build scalable, intelligent solutions. This class is designed to bridge knowledge gaps by introducing and contextualizing essential technologies that support integration, decision-making, and enterprise-level deployment. The motivation is to ensure that each team is equipped not only with automation tools but also with the relevant infrastructure, AI capabilities, and design frameworks necessary to deliver viable intelligent automation systems.
What will we do?
This session provides a structured overview of the additional technologies students are likely to encounter during their Intelligent Automation Team Challenge. Topics will include API integration, document intelligence, generative AI prompt engineering, multi-agent coordination, and ESG disclosure automation. The class will also review examples of how these technologies can be combined in modular frameworks using UiPath Cloud, GenAI services, and orchestration tools. In the second half of class, teams will apply these insights by continuing work on their projects with a focus on refining architecture and exploring new tool integrations.
How this relates to other classes:
We will begin by revisiting each team 's current architectural draft, emphasizing the role of intelligent features and integration logic. A brief recap of the technologies already in use such as UiPath and generative AI will be used to anchor the discussion before introducing new topics. This review ensures continuity from the prior workshop while laying the groundwork for expansion into more complex automation environments.
Teams are expected to identify at least one new technology or integration point to investigate for inclusion in their automation project. In upcoming sessions, students will begin validating these technologies through prototyping and testing. Deliverables should include clear documentation of how each intelligent or integrated component contributes to the overall solution. These explorations will inform the final presentation and submission of the Intelligent Automation Team Challenge, where teams must demonstrate both functional outputs and architectural clarity.
Materials and Preparation
Class Materials
- Case: Intelligent Automation Team Challenge Case"> Intelligent Automation Team Challenge Case
- Case: Technology for Good: Common Spring Project Case"> Technology for Good: Common Spring Project Case
- Slides: PowerPoint or PDF
- Analytics Tools: Business Process Modelling (BPM) software.
- Automation Tools: UIPath: Cloud, Maestro, Studio Web, Application Programming Interface (API), Gen AI Tools: Chat GPT
- Suggested in-class seating: assessment teams
-
Suggested Pre-Class Preparation
There is no suggested preparation for this class. -
Class Plan
- The beginning of class will include brief review and admin update, as well as a review of the Intelligent Automation Team Challenge Requirements.
- The remainder of class time is for team work on the Intelligent Automation Team Challenge. Teams should be prepared to check in and provide an update on the project and any anticipated challenges.
Additional Generative AI Materials
To reinforce the generative AI materials covered in this three class module, I have curated a set of activities that can be used to explore the capabilities of generative AI. These activities are designed to be engaging and informative, providing students with hands-on experience in using generative AI tools. The activities can be found on the EYARC Experience website. To access the EYARC Experience you will need to sign up using an email, your UW NetID and the course code 11401-70454-29527. Instructions for logging on can also be found in this pdf.
The Experience site offers three modules, Introduction to Gen AI, Prompt Engineering (revision from our data analytics course), and a new Gen AI Governance module. I expect that everyone will cover the Governance Module in preparation for the two team projects (one person per team as a minimum). With a deadline of May 29th, any attempts made on the quizzes on the EYARC Experience platform will count towards professionalism.
In addition, I also recommend working through the Gandalf Gen AI Security Game by Lakera AI, that provides an interactive way of thinking about security related issues with prompt engineering. For our purposes, this game will help you think about how system prompts can help in establishing better responses from Gen AI, which is important when we are relying on it within an Agentic Automation framework. Submissions by May 22nd will count towards individual professionalism scores. How far you progress is not important, submit a screenshot to canvas of your progress.
Required Deliverables
| Deliverable | Due Date | Canvas Submission Portal |
|---|---|---|
| Professionalism (individual): Screen shot of Gandalf Progress | May 22nd, 2025 | Upload to Canvas |