Cisco Automation course – 029 – Construct a Python script that calls a REST API using the requests library
Watch Full Demo on YouTube:
Introduction
When students first approach API automation, it is easy to reduce the task to one line of code: import requests, call a URL, print the output, and move on. In reality, useful automation is more disciplined than that. A script needs to know which endpoint to call, which method belongs to the operation, whether headers or credentials are required, what success looks like, how the server encodes its response, and what should happen when the remote platform is slow, unreachable, or returns something unexpected.
That is exactly why this chapter matters. Constructing a Python script that calls a REST API is not just about syntax. It is about translating API documentation into working behaviour. In a network automation context, the same pattern appears again and again: query inventory, create an object, update a configuration resource, delete a stale record, or poll for the status of an operation. If you can build that pattern once, cleanly and safely, you can reuse it across many APIs later.
For this walkthrough, the example API is JSONPlaceholder. It is ideal for learning because it exposes familiar resources such as posts, comments, users, and todos, and it accepts the common HTTP methods you need for CRUD-style practice. At the same time, it stays simple enough that you can focus on the client code rather than the server platform. The goal is not only to make requests, but to understand the entire request-response workflow in Python using requests.
Import and install requests
The first step is basic but important: make sure the library exists in the environment where you will run the script. In a learning environment, it is worth creating a virtual environment before you install anything. That keeps package versions isolated and makes your lab easier to reproduce later.
A simple setup looks like this:
```bash
python -m venv .venv

# Linux/macOS
source .venv/bin/activate

# Windows PowerShell
# .\.venv\Scripts\Activate.ps1

python -m pip install requests
```
Once the package is installed, importing it is straightforward:
```python
import requests
```
The key teaching point here is that installation is not just a box-ticking exercise. In real automation work, reproducibility matters. If your API script works only on one laptop with one mystery set of packages, it is fragile from the start. Using a virtual environment helps keep the learning lab close to real engineering practice.
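One lightweight way to make the lab reproducible is to record the exact package versions after installing — a sketch using standard pip commands:

```shell
# Record the exact package versions installed in this virtual environment,
# so the lab can be rebuilt on another machine later.
python -m pip freeze > requirements.txt

# On a new machine, recreate the same environment with:
# python -m pip install -r requirements.txt
```

With a `requirements.txt` checked in next to the script, "works on my laptop" becomes "works in any environment built from this file".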
Define the API URL
After importing the library, define the API location in a way that keeps the script readable. Hardcoding full URLs throughout the file makes later changes painful. A better pattern is to keep a base URL in one constant and then add paths as needed.
```python
BASE_URL = "https://jsonplaceholder.typicode.com"
```
From there, a specific resource becomes easy to express:
```python
post_url = f"{BASE_URL}/posts/1"
```
This may seem like a trivial convenience, but it is actually good automation hygiene. In network APIs, the hostname, version prefix, or path structure often changes between lab, staging, and production systems. If the base URL is centralised, the script becomes far easier to adapt.
It is also useful to distinguish path parameters from query parameters. Paths identify the resource itself, while query parameters refine what you want back. For example, /posts/1 asks for one post, while /posts?userId=1 filters the collection. In Python, it is cleaner to pass query parameters as a dictionary rather than manually building them into the URL string.
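You can see exactly what URL a params dictionary produces without sending any traffic, because requests exposes the URL-preparation step on its `PreparedRequest` class. A small sketch:

```python
import requests

BASE_URL = "https://jsonplaceholder.typicode.com"

# Build the URL the way requests.get(..., params=...) would,
# without sending anything over the network.
req = requests.models.PreparedRequest()
req.prepare_url(f"{BASE_URL}/posts", {"userId": 1})
print(req.url)  # https://jsonplaceholder.typicode.com/posts?userId=1
```

In a real call you would simply pass `params={"userId": 1}` to `requests.get()`, and the library performs the same encoding, including percent-escaping of any special characters.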
Optional headers
Headers are often the place where students first realise that “calling an API” is really an HTTP conversation. Headers tell the server what format you want, who the client is, what content type is being sent, and sometimes how the request is authorised.
For a JSON API, two headers usually deserve attention. The first is Accept, which tells the server that your client expects JSON back. The second is User-Agent, which helps identify the client in server logs and can make troubleshooting easier.
```python
headers = {
    "Accept": "application/json",
    "User-Agent": "ccna-auto-requests-demo/1.0",
}
```
If you send JSON in the request body, do not rush to type the Content-Type header manually. In requests, the cleaner approach is to use the json= argument when sending a payload. That way, the library serialises the Python dictionary for you and sets the JSON content type appropriately. Manual headers remain useful, but they should solve an actual requirement from the API documentation rather than being copied blindly.
Another useful habit is to keep default headers at the session level and add method-specific headers only when necessary. That creates cleaner code and reduces repetition.
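That session-level habit can be sketched with `requests.Session`, whose `headers` attribute holds defaults that every request through the session inherits:

```python
import requests

# Session-level defaults: every request made through this session
# carries these headers automatically.
session = requests.Session()
session.headers.update({
    "Accept": "application/json",
    "User-Agent": "ccna-auto-requests-demo/1.0",
})

# A per-call headers= argument is merged on top of the session defaults,
# so method-specific headers only appear where they are needed, e.g.:
# session.get("https://jsonplaceholder.typicode.com/posts/1",
#             headers={"Accept-Language": "en"}, timeout=10)
```

As a bonus, a session also reuses the underlying TCP connection across calls, which matters once a script talks to the same API repeatedly.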
Optional authentication
JSONPlaceholder does not require authentication, which keeps the example simple. Real controller APIs usually do. That means an honest chapter 2.9 explanation should still show students where authentication fits into the script, even if the sample API leaves the field empty.
For HTTP Basic Authentication, requests supports the auth= parameter directly:
```python
response = requests.get(
    "https://example-api.local/resource",
    auth=("username", "password"),
    timeout=10,
)
```
For bearer tokens, you usually place the token in the Authorization header:
```python
token = "replace-with-real-token"
headers = {
    "Accept": "application/json",
    "Authorization": f"Bearer {token}",
}
```
For API keys, the exact header name depends on the platform. A common pattern looks like this:
```python
api_key = "replace-with-real-api-key"
headers = {
    "Accept": "application/json",
    "X-API-Key": api_key,
}
```
In CCNA Automation-style labs, the conceptual lesson matters more than the syntax. Students should learn that credentials are part of the request contract, that they belong outside the main code path where possible, and that they should never be printed casually during debugging. A good beginner habit is to load tokens, usernames, passwords, or API keys from environment variables instead of hardcoding them.
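A minimal sketch of that habit, using the standard library's `os.environ` (the variable name `DEMO_API_TOKEN` is made up for this demo; real platforms document their own conventions):

```python
import os

# DEMO_API_TOKEN is a hypothetical name; the fallback keeps this demo
# runnable, but a real script should fail loudly when the token is absent.
token = os.environ.get("DEMO_API_TOKEN", "missing-token")

headers = {
    "Accept": "application/json",
    "Authorization": f"Bearer {token}",
}

# Never print the token itself; log only whether it was found.
print("Token loaded:", token != "missing-token")
```

The same pattern works for usernames, passwords, and API keys: the script reads the secret at startup, and the secret never appears in the source file or in version control.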
Send the HTTP request
Once the URL and optional headers are ready, the real work begins. HTTP methods express intent. GET asks for data. POST creates something new. PUT replaces a resource. DELETE removes one. In this chapter, understanding that mapping is more important than memorising the spelling of each function call.
A simple GET call looks like this:
```python
import requests

BASE_URL = "https://jsonplaceholder.typicode.com"

response = requests.get(
    f"{BASE_URL}/posts/1",
    headers={"Accept": "application/json"},
    timeout=10,
)
print(response.status_code)
print(response.json())
```
A POST call submits a JSON payload to create a new resource:
```python
payload = {
    "title": "CCNA Automation demo",
    "body": "Created with Python requests.",
    "userId": 1,
}
response = requests.post(
    f"{BASE_URL}/posts",
    json=payload,
    headers={"Accept": "application/json"},
    timeout=10,
)
print(response.status_code)
print(response.json())
```
A PUT call replaces the target resource:
```python
payload = {
    "id": 1,
    "title": "Updated title",
    "body": "Updated body",
    "userId": 1,
}
response = requests.put(
    f"{BASE_URL}/posts/1",
    json=payload,
    headers={"Accept": "application/json"},
    timeout=10,
)
print(response.status_code)
print(response.json())
```
A DELETE call removes the resource:
```python
response = requests.delete(
    f"{BASE_URL}/posts/1",
    headers={"Accept": "application/json"},
    timeout=10,
)
print(response.status_code)
print(response.text)
```
This is the moment where JSONPlaceholder’s behaviour becomes a teaching point. It accepts the write methods and returns responses that look like create, update, and delete operations, but it does not truly persist them. That is not a flaw for this chapter. It actually helps students concentrate on how the client behaves: method selection, payload format, status checking, and response parsing.
Receive the response
Every requests call returns a Response object. That object is the centre of the script’s decision-making. It holds the HTTP status code, headers, body, and metadata about the request. Good client code does not jump straight to response.json(); it first considers what the response is telling you overall.
In practice, the first questions are simple. Did the server respond at all? Which status code came back? What content type does the body claim to have? Is there even a body to parse?
That is why response handling should feel sequential rather than magical. The script sends a request, receives a response, confirms whether the response matches expectations, and only then reads the payload. That is a much safer pattern than assuming every successful network exchange returns valid JSON with the exact keys you hoped for.
It is also worth remembering that requests is synchronous and blocking. The function call waits for the response or for a timeout. From a teaching perspective, that is useful because it makes the request flow easy to reason about. It also means that poorly configured scripts can hang if they omit timeouts.
Check the status code
Status-code handling is where beginner scripts become reliable scripts. The wrong pattern is to assume that anything returning JSON is automatically fine. The better pattern is to treat the status code as the primary signal, and the body as supporting information.
At the simplest level, you can call raise_for_status():
```python
response.raise_for_status()
```
That causes requests to raise an exception for unsuccessful 4xx and 5xx responses. For many scripts, that is already a major improvement over blindly printing the body.
Still, chapter 2.9 should go one step further. Real APIs often use different success codes for different operations. A GET is usually 200 OK. A POST is often 201 Created. A DELETE may return 200 OK with a body or 204 No Content with nothing in the body. So the safest pattern is often:
```python
response.raise_for_status()
if response.status_code not in {200, 201, 204}:
    raise ValueError(f"Unexpected success code: {response.status_code}")
```
That combination matters because raise_for_status() catches obvious failure, while exact status checking catches mismatches in expected behaviour. For automation, this is valuable. If your script expects a create operation to return 201 but gets a plain 200, that might still be technically successful, but it is worth noticing because the API contract may be different from what you believed.
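One way to encode those expectations is a small lookup of allowed success codes per operation type. This is a hypothetical helper, not a requests feature — just a sketch of treating the status code as part of the API contract:

```python
# Hypothetical mapping: the success codes this script expects
# for each kind of operation it performs.
EXPECTED_CODES = {
    "read": {200},
    "create": {200, 201},
    "delete": {200, 204},
}

def check_success(operation: str, status_code: int) -> None:
    """Raise if a technically-successful code does not match the contract."""
    allowed = EXPECTED_CODES[operation]
    if status_code not in allowed:
        raise ValueError(
            f"{operation} returned {status_code}, "
            f"expected one of {sorted(allowed)}"
        )

check_success("create", 201)  # matches the contract, so no exception
```

After `raise_for_status()` has filtered out outright failures, a call like `check_success("create", response.status_code)` flags the subtler case where the API behaves successfully but differently from what the documentation led you to expect.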
Read JSON data
Only after the status code has been checked should the script parse the body as JSON. In requests, that typically means:
```python
data = response.json()
```
This convenience is one reason the library is so popular, but it should not be treated as magic. JSON decoding can fail. A response may be empty. A server may return HTML, plain text, or a body that claims to be JSON but is malformed.
A safer pattern is to combine content-type awareness with JSON parsing:
```python
content_type = response.headers.get("Content-Type", "")
if response.status_code != 204 and "application/json" in content_type.lower():
    data = response.json()
else:
    data = None
```
For students, there is another important lesson here: decoding JSON is not the same thing as validating its shape. Even if response.json() succeeds, the result may still be missing fields you require. For example, if you expect a post object, then id, userId, title, and body should all exist and have sensible types. That validation step is especially important in automation, where downstream logic may break if the input structure changes.
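That shape-validation step can be sketched as a small function. The helper name and the exact error messages are illustrative; the field list matches the post object described above:

```python
def validate_post(data: dict) -> None:
    """Check that a decoded post object has the fields and types we rely on."""
    required = {"id": int, "userId": int, "title": str, "body": str}
    for field, expected_type in required.items():
        if field not in data:
            raise ValueError(f"Missing field: {field}")
        if not isinstance(data[field], expected_type):
            raise ValueError(
                f"Field {field!r} is {type(data[field]).__name__}, "
                f"expected {expected_type.__name__}"
            )

# A well-formed post passes silently.
validate_post({"id": 1, "userId": 1, "title": "demo", "body": "text"})
```

Calling `validate_post(data)` immediately after decoding turns a vague downstream `KeyError` into a clear, early failure that names exactly which field is wrong.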
Handle errors cleanly
The final step in the basic lifecycle is error handling. This is where a script becomes trustworthy. A clean failure is more useful than a confusing partial success.
In requests-based code, the most common exceptions to think about are timeout problems, connection problems, HTTP status failures, and JSON decoding issues. A simple pattern looks like this:
```python
import requests

try:
    response = requests.get(
        "https://jsonplaceholder.typicode.com/posts/1",
        timeout=10,
    )
    response.raise_for_status()
    data = response.json()
    print(data)
except requests.exceptions.Timeout:
    print("The request timed out.")
except requests.exceptions.ConnectionError:
    print("The API could not be reached.")
except requests.exceptions.HTTPError as exc:
    print(f"HTTP error: {exc}")
except requests.exceptions.JSONDecodeError as exc:  # requires requests >= 2.27
    print(f"The response body was not valid JSON: {exc}")
except requests.exceptions.RequestException as exc:
    print(f"General requests error: {exc}")
```
That is a good beginner structure because it separates the major failure classes without overwhelming the reader. From there, the next improvement is logging, so that the script records which URL was called, which method was used, and which status code came back.
Another worthwhile habit is to avoid swallowing exceptions silently. If an automation script fails to reach an API, the operator should know exactly whether the problem was DNS, connectivity, permissions, payload quality, or response parsing. “Something went wrong” is not enough in real troubleshooting.
Conclusion
Constructing a Python script that calls a REST API is really about building a reliable workflow: prepare the environment, define the endpoint, add the right headers and optional authentication, choose the correct HTTP method, receive and inspect the response, validate success, decode JSON safely, and handle failure in a way that is useful for troubleshooting.
That is why requests is such a good fit for beginners and early automation students. It makes the happy path readable, but it also gives you the tools to write better code when the happy path breaks. If you learn chapter 2.9 properly, you are not just learning how to call JSONPlaceholder. You are learning a reusable template for controller APIs, SaaS APIs, and many of the platform interfaces you will automate later.
References
- Cisco Certified DevNet Expert (v1.0) Equipment and Software List
- How to Build an API Request in 3 Ways
- Official Cisco exam-topic and study-plan material confirming the objective wording for constructing a Python script that calls a REST API with requests, and showing its placement beside request construction, response codes, response anatomy, authentication, and API-style comparison.
- JSONPlaceholder home page and guide for supported resources, routes, filtering, nested resources, and the important caveat that write operations are faked rather than truly persisted.
- requests developer interface and quickstart for supported methods, request parameters, headers, json=, Response usage, raise_for_status(), JSON decoding, timeouts, sessions, and exceptions.
- requests advanced usage for session behaviour, connection pooling, transport adapters, automatic retries, blocking behaviour, timeout guidance, and TLS verification defaults.
- urllib3 Retry documentation for allowed_methods, status_forcelist, exponential backoff, Retry-After handling, and the default focus on idempotent methods.
- RFC 9110 and MDN HTTP status references for idempotent methods and the semantics of 201, 202, 204, 401, 403, 409, and 429.
- Python documentation for virtual environments, logging, JSON decoding errors, and unittest.mock.patch, which support the installation, logging, validation, and test-stub sections of the report.
- requests installation reference for the standard python -m pip install requests command.
- Official repository issue discussions for JSONPlaceholder caveats around negative testing and DELETE behaviour, used here to justify treating it as a mechanics-first demo API rather than a perfect model of production error contracts.



