Sending 87 requests to get 87 results from a paginated API is very inefficient.
You should also never hard-code "magic numbers" like 87 in your code, especially when interacting with an API you don't control. If people get added to or removed from the list, your code will no longer work correctly.
Note that `/people/` provides a `next` field that gives you a link to the next page. You can therefore use an asynchronous `while` loop to fetch all the results in just 9 requests (10 results per page).
Something like the following should do it; you can swap out native `fetch` for `axios` if required:
```js
;(async () => {
  let nextPage = 'https://swapi.py4e.com/api/people/'
  let people = []

  // Keep requesting pages until the API returns next: null
  while (nextPage) {
    const res = await fetch(nextPage)
    const { next, results } = await res.json()
    nextPage = next
    people = [...people, ...results]
  }

  console.log(people.length)
  console.log(people)
})()
```
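
If you do go with `axios`, the loop barely changes. Here's a minimal sketch of that variant (assuming `axios` is already installed in your project):

```js
// Same pagination loop, using axios instead of native fetch.
// Assumes axios has been installed (npm install axios).
import axios from 'axios'

;(async () => {
  let nextPage = 'https://swapi.py4e.com/api/people/'
  let people = []

  while (nextPage) {
    // axios parses the JSON body for you and exposes it on res.data
    const res = await axios.get(nextPage)
    const { next, results } = res.data
    nextPage = next
    people = [...people, ...results]
  }

  console.log(people.length)
  console.log(people)
})()
```

Either way, the loop stops on its own when `next` comes back as `null`, so nothing needs to know in advance how many people (or pages) there are.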