Go Web Scraping Quick Start Guide

Go is simple

Beyond the architecture of the Go programming language itself, the standard library offers all the packages you need to make web scraping easy. Go provides a built-in HTTP client in the net/http package that is fully featured out of the box, but also allows for a lot of customization. Making an HTTP request is as simple as the following:

http.Get("http://example.com")

Also part of the net/http package are utilities to structure HTTP requests and HTTP responses, as well as constants for all of the HTTP status codes, which we will dive into later in this book. You will rarely need third-party packages to handle communication with web servers. The Go standard library also has tools to help analyze HTTP requests, quickly consume HTTP response bodies, and debug the requests and responses in your web scraper. The HTTP client in the net/http package is also very configurable, letting you tune special parameters and methods to suit your specific needs. You will typically not need to do this, but the option exists if you encounter such a situation.

This simplicity helps eliminate some of the guesswork of writing code. You will not need to determine the best way to make an HTTP request; Go has already worked that out and provided the tools you need to get the job done. Even when you need more than the standard library, the Go community has built tools that follow the same culture of simplicity, which makes integrating third-party libraries an easy task.