aiohttp Proxy Setup
aiohttp is Python's leading asynchronous HTTP client/server library, built on asyncio. It enables high-throughput, non-blocking HTTP requests — ideal for concurrent web scraping and data collection at scale. aiohttp supports proxy configuration per request, making it straightforward to integrate with Hex Proxies.
Why Use aiohttp with Proxies?
aiohttp can send hundreds of concurrent requests using a single thread, but without proxy rotation, all requests share your server's IP. Target sites detect the burst pattern and block your IP quickly. Hex Proxies' rotating residential pool assigns a different IP per connection, allowing your async scraper to maintain high success rates even at high concurrency.
Basic Proxy Configuration
```python
import asyncio

import aiohttp

async def fetch(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(
            url,
            proxy='http://user:pass@gate.hexproxies.com:8080',
            timeout=aiohttp.ClientTimeout(total=30),
        ) as resp:
            return await resp.text()

html = asyncio.run(fetch('https://httpbin.org/ip'))
print(html)
```
Concurrent Requests with Proxy Rotation
```python
import asyncio

import aiohttp

async def fetch_one(session, url):
    async with session.get(
        url,
        proxy='http://user:pass@gate.hexproxies.com:8080',
        timeout=aiohttp.ClientTimeout(total=30),
    ) as resp:
        return await resp.text()

async def fetch_all(urls):
    connector = aiohttp.TCPConnector(limit=20)  # Max concurrent connections
    async with aiohttp.ClientSession(connector=connector) as session:
        tasks = [fetch_one(session, url) for url in urls]
        # return_exceptions=True keeps one failed request from cancelling the rest
        return await asyncio.gather(*tasks, return_exceptions=True)

results = asyncio.run(fetch_all(['https://httpbin.org/ip'] * 5))
```
IP Whitelist Authentication
Whitelist your server IP in the Hex Proxies dashboard and omit credentials from the proxy URL:
```python
proxy='http://gate.hexproxies.com:8080'
```
Best Practices
- **Limit concurrency** using `TCPConnector(limit=N)` or `asyncio.Semaphore` to avoid overwhelming target sites. Start with 10-20 concurrent requests and scale up.
- **Combine with rotation** — Hex Proxies' rotating pool assigns a new IP per connection by default, which pairs well with aiohttp's concurrent request model.
- **Add retries with backoff** using the `aiohttp-retry` package or custom exception handling in your gather loop.
- **Set explicit timeouts** via `aiohttp.ClientTimeout` to prevent coroutines from hanging indefinitely.
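The concurrency, retry, and timeout practices above can be combined in a single sketch. This is a minimal illustration, not a Hex Proxies default: the gateway URL, semaphore size, and retry counts are placeholders you would tune for your own workload.

```python
import asyncio

import aiohttp

PROXY = 'http://user:pass@gate.hexproxies.com:8080'  # placeholder gateway URL
SEMAPHORE = asyncio.Semaphore(10)  # cap in-flight requests per the guidance above

async def fetch_with_retry(session, url, retries=3):
    # Exponential backoff: wait 1s, then 2s, then 4s between attempts.
    for attempt in range(retries):
        try:
            async with SEMAPHORE:
                async with session.get(
                    url,
                    proxy=PROXY,
                    timeout=aiohttp.ClientTimeout(total=30),
                ) as resp:
                    resp.raise_for_status()
                    return await resp.text()
        except (aiohttp.ClientError, asyncio.TimeoutError):
            if attempt == retries - 1:
                raise  # out of attempts; surface the error to the caller
            await asyncio.sleep(2 ** attempt)
```

Handling retries per request (rather than around `asyncio.gather`) means one slow or failing URL only delays its own coroutine; with a rotating pool, each retry also goes out through a fresh IP.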
Troubleshooting
- **aiohttp.ClientProxyConnectionError**: Verify proxy host, port, and credentials. Check that outbound port 8080 is open.
- **TimeoutError**: Residential proxies add latency. Increase the timeout (e.g. `ClientTimeout(total=60)` or higher) for slow targets.
- **Too many open files**: Increase your OS file descriptor limit (`ulimit -n 4096`) when running high concurrency.
- **SSL certificate errors**: Update certifi (`pip install --upgrade certifi`) or pass `ssl=False` for testing (not recommended in production).
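Rather than disabling verification with `ssl=False`, a safer workaround for certificate errors is to pass an explicit `SSLContext` built from certifi's CA bundle. A minimal sketch, reusing the example gateway URL from above:

```python
import asyncio
import ssl

import aiohttp
import certifi

# An SSLContext backed by certifi's up-to-date CA bundle, instead of
# disabling certificate verification entirely with ssl=False.
ssl_context = ssl.create_default_context(cafile=certifi.where())

async def fetch(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(
            url,
            proxy='http://user:pass@gate.hexproxies.com:8080',
            ssl=ssl_context,
        ) as resp:
            return await resp.text()
```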