Agents Flashcards
(20 cards)
What is an agent?
An agent is anything that perceives its environment through sensors and acts on it through actuators.
Example: A robot vacuum senses dirt and moves to clean it.
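A minimal sketch of this perceive/act loop in Python (the two-square dirt world, the movement rule, and the three-step run are illustrative assumptions, not part of the card):

```python
# Minimal agent/environment loop: the agent perceives the world through a
# sensor reading and acts on it through an actuator command.
# (The two-square "A"/"B" dirt world is an illustrative assumption.)
world = {"A": "Dirty", "B": "Clean"}   # environment state
location = "A"

for step in range(3):
    percept = (location, world[location])              # sensor reading
    if percept[1] == "Dirty":
        action = "Suck"                                 # decide
    else:
        action = "Right" if location == "A" else "Left"
    if action == "Suck":                                # actuators change the world
        world[location] = "Clean"
    elif action == "Right":
        location = "B"
    else:
        location = "A"
    print(step, percept, "->", action)
```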
What is an agent function?
A mapping from percept sequences to actions — it defines how the agent decides what to do.
Example: If the agent sees [A, Dirty], it decides to Suck.
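One way to picture the mapping is a literal lookup table keyed by the percept sequence seen so far, as in a table-driven agent (the table entries and the NoOp default below are illustrative assumptions based on the vacuum example):

```python
# Table-driven agent: the agent function is a table mapping percept
# sequences (tuples of percepts so far) to actions.
# Table entries are illustrative, based on the vacuum-world example.
TABLE = {
    (("A", "Dirty"),): "Suck",
    (("A", "Clean"),): "Right",
    (("A", "Clean"), ("B", "Dirty")): "Suck",
}

percepts = []  # percept sequence observed so far

def table_driven_agent(percept):
    percepts.append(percept)
    return TABLE.get(tuple(percepts), "NoOp")  # default if sequence is not in the table

print(table_driven_agent(("A", "Dirty")))  # -> Suck
```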
What is a rational agent?
An agent that selects actions expected to maximize its performance measure, based on percepts and knowledge.
Example: A self-driving car drives safely and efficiently to its destination while avoiding traffic.
What is a fully observable environment?
An environment where the agent's sensors give it access to the complete state at any time.
Example: In chess, both players can see the full board.
What is a partially observable environment?
An environment where the agent only has partial or limited access to the full state.
Example: A robot navigating a maze can’t see around walls — it only knows what its sensors detect.
What is a deterministic environment?
An environment where the next state is completely predictable, based on the current state and action.
Example: In Sudoku, placing a 5 in a cell has a fixed outcome.
What is a stochastic environment?
An environment that includes randomness — actions may lead to different results.
Example: In League of Legends, an attack may or may not critically strike due to chance.
What is an episodic environment?
An environment divided into independent episodes, so the current decision does not affect later ones.
Example: A spam filter processes each email separately.
What is a sequential environment?
An environment where current decisions affect future ones.
Example: In Sudoku, placing a number limits future choices in that row/column/box.
What is a static environment?
An environment that does not change while the agent is deciding.
Example: Chess without a clock; the board doesn't change while you think.
What is a dynamic environment?
An environment that can change on its own, even while the agent is thinking.
Example: In real-world driving, traffic may change suddenly.
What is a discrete environment?
One with a finite set of states and actions.
Example: In TFT unit placement, you can only place units on specific grid tiles.
What is a continuous environment?
One where states/actions exist on a spectrum and can take on real values.
Example: In League of Legends, player movement is smooth and not limited to a grid.
What is a single-agent environment?
One where only one agent makes decisions.
Example: Solving a maze alone — no other agents to interact with.
What is a multi-agent environment?
One with two or more agents interacting — possibly competing or cooperating.
Example: Chess — both players are agents making decisions.
What is a simple reflex agent?
Reacts based only on the current percept using condition-action rules.
Example: A vacuum agent that follows the rule: if Dirty, then Suck.
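A minimal sketch of the condition-action rules, assuming the two-square vacuum world used in earlier cards (the movement rules are illustrative):

```python
# Simple reflex agent: acts on the current percept only, via
# condition-action rules; it keeps no history or internal state.
def simple_reflex_vacuum(percept):
    location, status = percept
    if status == "Dirty":          # rule: if Dirty then Suck
        return "Suck"
    if location == "A":            # rule: if in A (and clean) then move Right
        return "Right"
    return "Left"                  # rule: if in B (and clean) then move Left

print(simple_reflex_vacuum(("B", "Dirty")))  # -> Suck
```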
What is a model-based reflex agent?
Like a simple reflex agent, but it keeps an internal state to deal with partially observable environments.
Example: A vacuum that remembers which rooms it already cleaned.
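A sketch of that internal state, assuming a hypothetical ModelBasedVacuum class for the two-square world (the "everything clean, so NoOp" rule is an illustrative assumption):

```python
# Model-based reflex agent: like a reflex agent, but it keeps internal
# state (here, which squares are known to be clean) to cope with not
# being able to sense the whole environment at once.
class ModelBasedVacuum:
    def __init__(self):
        self.cleaned = set()           # internal model: squares known to be clean

    def act(self, percept):
        location, status = percept
        if status == "Dirty":
            self.cleaned.discard(location)
            return "Suck"
        self.cleaned.add(location)     # update the model from the percept
        if self.cleaned >= {"A", "B"}: # model says everything is clean
            return "NoOp"
        return "Right" if location == "A" else "Left"

agent = ModelBasedVacuum()
print(agent.act(("A", "Clean")))   # -> Right
print(agent.act(("B", "Clean")))   # -> NoOp (remembers A was already clean)
```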
What is a goal-based agent?
Chooses actions to achieve a defined goal, often using planning.
Example: A robot navigating to a charging station.
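A sketch of the planning step, assuming the charging station sits in a one-dimensional corridor and using breadth-first search (both are illustrative assumptions, not the card's stated method):

```python
# Goal-based agent: it plans a sequence of actions that reaches a stated
# goal state, rather than reacting to the current percept alone.
from collections import deque

def plan_to_goal(start, goal, moves=(-1, 1), lo=0, hi=9):
    frontier = deque([(start, [])])          # (position, actions so far)
    visited = {start}
    while frontier:
        pos, path = frontier.popleft()
        if pos == goal:
            return path                      # plan that achieves the goal
        for m in moves:
            nxt = pos + m
            if lo <= nxt <= hi and nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, path + ["Right" if m > 0 else "Left"]))
    return None                              # goal unreachable

print(plan_to_goal(start=2, goal=6))  # -> ['Right', 'Right', 'Right', 'Right']
```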
What is a utility-based agent?
Chooses actions that maximize utility, not just achieve a goal.
Example: Google Maps choosing a route that avoids traffic even if it’s longer in distance.
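A sketch of the idea, with made-up routes and a utility function equal to negative expected travel time (both are illustrative assumptions):

```python
# Utility-based agent: every candidate route reaches the goal, but the
# agent picks the one with the highest utility.
routes = {
    "highway":   {"distance_km": 20, "expected_minutes": 45},  # shorter but congested
    "back_road": {"distance_km": 28, "expected_minutes": 30},  # longer but free-flowing
}

def utility(route):
    return -route["expected_minutes"]   # prefer less expected travel time

best = max(routes, key=lambda name: utility(routes[name]))
print(best)  # -> back_road: longer in distance, but higher utility
```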
What is a learning (autonomous) agent?
Learns from experience to improve its behavior in changing or unknown environments.
Example: A recommendation system that learns from user clicks to show better content.
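A toy sketch of learning from feedback, assuming a two-item recommender that tracks click rates (the items, the feedback stream, and the simple counting update are all illustrative):

```python
# Learning agent: it improves its behaviour from experience. This toy
# recommender keeps a running click-rate estimate per item and recommends
# the item with the highest estimate so far.
clicks = {"article_a": 0, "article_b": 0}
shows = {"article_a": 0, "article_b": 0}

def recommend():
    # pick the item with the best observed click rate
    return max(clicks, key=lambda item: clicks[item] / shows[item] if shows[item] else 0.0)

def learn(item, clicked):
    shows[item] += 1
    clicks[item] += int(clicked)        # update experience with the new feedback

# simulated interaction: article_b gets clicked more often
for item, clicked in [("article_a", False), ("article_b", True),
                      ("article_a", False), ("article_b", True)]:
    learn(item, clicked)

print(recommend())  # -> article_b
```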