Python Proxy Setup
Python is the most popular language for web scraping and data collection, with three major HTTP libraries that each handle proxies differently. The requests library uses a dictionary-based proxy configuration, urllib relies on environment variables or ProxyHandler objects, and httpx supports both sync and async proxying through a unified client interface. Understanding these differences lets you pick the right tool for your workload.
This guide walks you through configuring Python to route HTTP and HTTPS requests through Hex Proxies, including installation, authentication, error handling, and production best practices.
Prerequisites
Before you begin, make sure you have:

- An active Hex Proxies account with proxy credentials
- Python 3.8+
- The pip package manager
- The requests or httpx library
Installation
```bash
pip install requests
```

Basic Proxy Configuration
The requests library accepts a proxies dictionary mapping protocol schemes to proxy URLs. Both HTTP and HTTPS traffic should point to gate.hexproxies.com:8080. The library handles CONNECT tunneling for HTTPS automatically.
```python
import requests

proxies = {
    "http": "http://user:pass@gate.hexproxies.com:8080",
    "https": "http://user:pass@gate.hexproxies.com:8080",
}

# Basic GET request through proxy
response = requests.get("https://httpbin.org/ip", proxies=proxies, timeout=30)
print(response.json())

# Session-based proxy (reuses connections)
session = requests.Session()
session.proxies.update(proxies)
session.headers.update({"User-Agent": "Mozilla/5.0"})

for url in ["https://example.com", "https://httpbin.org/headers"]:
    resp = session.get(url, timeout=30)
    print(resp.status_code)
```
Session Objects vs Per-Request Proxies
Python's requests.Session() reuses TCP connections across multiple requests, reducing handshake overhead by 30-50ms per request. For scraping workflows that make hundreds of sequential requests, sessions cut total execution time significantly. Per-request proxy dictionaries create a new connection each time, which is useful when you need a fresh IP on every call but adds latency.
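The trade-off can be sketched as two helper functions, one per strategy. The credentials and function names below are placeholders for illustration, not part of any Hex Proxies SDK:

```python
import requests

# Placeholder Hex Proxies credentials; replace with your own.
PROXIES = {
    "http": "http://user:pass@gate.hexproxies.com:8080",
    "https": "http://user:pass@gate.hexproxies.com:8080",
}

def fetch_with_session(urls):
    """Reuses one TCP connection to the proxy across all requests."""
    with requests.Session() as session:
        session.proxies.update(PROXIES)
        return [session.get(u, timeout=30).status_code for u in urls]

def fetch_per_request(urls):
    """Opens a fresh connection (and potentially a fresh exit IP) per call."""
    return [requests.get(u, proxies=PROXIES, timeout=30).status_code for u in urls]
```

Prefer the session variant for sequential crawls of the same host; reach for the per-request variant only when a fresh connection on every call is the point.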
urllib and httpx Alternatives
For stdlib-only deployments, urllib.request.ProxyHandler works without any pip dependencies. For async workloads, httpx.AsyncClient supports the same proxy URL format and integrates with asyncio event loops natively.
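A minimal sketch of both alternatives, using the same placeholder credentials as above (the httpx portion is guarded so it only runs if the package is installed; note that httpx 0.26+ takes `proxy=` while older releases used `proxies=`):

```python
import urllib.request

# stdlib-only option: route urllib through the proxy.
proxy_url = "http://user:pass@gate.hexproxies.com:8080"
opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"http": proxy_url, "https": proxy_url})
)
urllib.request.install_opener(opener)  # urlopen() now uses the proxy globally

# async option, if httpx is installed (pip install httpx)
try:
    import httpx

    async def fetch_ip():
        # Same proxy URL format; `proxy=` in httpx >= 0.26, `proxies=` earlier.
        async with httpx.AsyncClient(proxy=proxy_url, timeout=30) as client:
            resp = await client.get("https://httpbin.org/ip")
            return resp.json()
    # asyncio.run(fetch_ip()) would execute the request
except ImportError:
    httpx = None  # httpx not installed; the urllib path above still works
```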
Configuration Options
- **Proxy URL** -- http://user:pass@gate.hexproxies.com:8080 with credentials embedded in the URL string.
- **Timeout** -- Always pass timeout=30 to requests.get() to prevent indefinite hangs on unresponsive destinations.
- **SSL Verification** -- Keep verify=True (the default) in production. Disabling it masks certificate errors that could indicate MITM attacks.
- **Connection Pooling** -- Use requests.Session() to reuse connections. The default pool size is 10; increase via HTTPAdapter for high-concurrency workloads.
- **Retry Logic** -- Mount a urllib3.util.retry.Retry adapter on your session for automatic retries with backoff.
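The pooling and retry options above can be combined on one session. This is a sketch with assumed values (3 retries, pool size 50, placeholder credentials), not prescribed settings:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
session.proxies.update({
    "http": "http://user:pass@gate.hexproxies.com:8080",
    "https": "http://user:pass@gate.hexproxies.com:8080",
})

retry = Retry(
    total=3,                      # up to 3 retries per request
    backoff_factor=1,             # waits 1s, 2s, 4s between attempts
    status_forcelist=[429, 500, 502, 503, 504],
)
adapter = HTTPAdapter(max_retries=retry, pool_connections=50, pool_maxsize=50)
session.mount("http://", adapter)  # adapter handles all http/https traffic
session.mount("https://", adapter)
```

Every request made through this session now pools connections and retries transient failures with exponential backoff, with no extra code at the call sites.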
Error Handling
Python's requests library raises specific exceptions for proxy-related failures that you should catch individually.
1. requests.exceptions.ProxyError
   - Raised when the proxy server is unreachable or rejects the connection
   - Verify gate.hexproxies.com:8080 is reachable from your network
   - Check that your credentials are URL-encoded if they contain special characters like @ or #
2. requests.exceptions.ConnectTimeout
   - The proxy accepted the connection but the destination did not respond
   - Increase the timeout from 30 to 60 seconds for slow destinations
   - Check if the destination is blocking the proxy IP and try a different session
3. requests.exceptions.SSLError
   - Certificate verification failed during the HTTPS CONNECT tunnel
   - Update your certifi package: pip install --upgrade certifi
   - Never set verify=False in production; it silently bypasses security
4. HTTP 407 Proxy Authentication Required
   - Your credentials were rejected by the proxy gateway
   - Double-check the username and password on the Hex Proxies dashboard
   - Ensure special characters in credentials are percent-encoded
5. HTTP 429 Too Many Requests
   - The destination is rate-limiting your proxy IP
   - Add time.sleep() delays between requests (1-3 seconds)
   - Switch to per-request rotation to distribute load across IPs
Always wrap proxy requests in try/except blocks and log the exception type for debugging.
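The failure modes above can be handled in one wrapper. This is an illustrative sketch (the `fetch` helper and its retry/backoff values are assumptions, with placeholder credentials), not a prescribed pattern:

```python
import logging
import time
from typing import Optional

import requests

proxies = {
    "http": "http://user:pass@gate.hexproxies.com:8080",
    "https": "http://user:pass@gate.hexproxies.com:8080",
}

def fetch(url: str, retries: int = 3) -> Optional[requests.Response]:
    for attempt in range(retries):
        try:
            resp = requests.get(url, proxies=proxies, timeout=30)
            if resp.status_code == 429:
                time.sleep(2 ** attempt)  # back off before retrying
                continue
            resp.raise_for_status()
            return resp
        except requests.exceptions.ProxyError as exc:
            logging.error("Proxy unreachable or rejected connection: %s", exc)
        except requests.exceptions.ConnectTimeout as exc:
            logging.error("Destination did not respond in time: %s", exc)
        except requests.exceptions.SSLError as exc:
            logging.error("Certificate verification failed: %s", exc)
            break  # don't retry; investigate instead of disabling verify
        except requests.exceptions.RequestException as exc:
            logging.error("Request failed (%s): %s", type(exc).__name__, exc)
    return None
```

Logging `type(exc).__name__` alongside the message makes it easy to tell proxy-side failures (ProxyError, 407) apart from destination-side ones (ConnectTimeout, 429) in production logs.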