code-from-vague-spec
Working code from a paragraph of spec
Write a Laravel artisan command pexels:fetch that takes --query and --count, hits the Pexels API, downloads images to storage/app/pexels/, retries failed downloads twice with exponential backoff, and respects the 200/min rate limit. Use Guzzle. Save photo IDs and credit to a CSV alongside the images. Do not depend on a third-party Pexels SDK.
<?php

namespace App\Console\Commands;

use GuzzleHttp\Client;
use Illuminate\Console\Command;
use Illuminate\Support\Facades\Storage;

class FetchFromPexels extends Command
{
    protected $signature = 'pexels:fetch {--query=} {--count=10}';

    protected $description = 'Fetch images from Pexels with retry and rate-limit awareness.';

    private const ENDPOINT = 'https://api.pexels.com/v1/search';
    private const RATE_LIMIT_PER_MIN = 200;
    private const MAX_RETRIES = 2;

    public function handle(): int
    {
        $query = $this->option('query');
        $count = (int) $this->option('count');

        if (! $query || $count < 1) {
            $this->error('--query is required and --count must be >= 1');
            return self::FAILURE;
        }

        $apiKey = env('PEXELS_API_KEY');
        if (! $apiKey) {
            $this->error('PEXELS_API_KEY is missing.');
            return self::FAILURE;
        }

        $http = new Client(['timeout' => 30]);

        Storage::disk('local')->makeDirectory('pexels');

        // Append to the index CSV; write the header only when the file is new or empty.
        // (Check before fopen: 'a' mode creates the file, which would defeat the test.)
        $csvPath = storage_path('app/pexels/index.csv');
        $writeHeader = ! file_exists($csvPath) || filesize($csvPath) === 0;
        $csv = fopen($csvPath, 'a');
        if ($writeHeader) {
            fputcsv($csv, ['id', 'photographer', 'src', 'saved_path']);
        }

        $perBatch = min(80, $count); // Pexels caps per_page at 80.
        $remaining = $count;
        $saved = 0;
        $page = 1;

        // Minimum spacing between search requests to stay under 200/min.
        $minIntervalMs = (int) ceil(60_000 / self::RATE_LIMIT_PER_MIN);

        while ($remaining > 0) {
            $perPage = min($perBatch, $remaining);
            $start = microtime(true);

            $resp = $http->get(self::ENDPOINT, [
                'headers' => ['Authorization' => $apiKey],
                'query' => ['query' => $query, 'per_page' => $perPage, 'page' => $page],
            ]);
            $payload = json_decode((string) $resp->getBody(), true);

            $photos = $payload['photos'] ?? [];
            if ($photos === []) {
                // The query has fewer results than requested; stop rather than loop forever.
                break;
            }

            foreach ($photos as $photo) {
                $bytes = $this->downloadWithRetry($http, $photo['src']['large2x']);
                if (! $bytes) {
                    continue;
                }
                $filename = "pexels/{$photo['id']}.jpg";
                Storage::disk('local')->put($filename, $bytes);
                fputcsv($csv, [$photo['id'], $photo['photographer'], $photo['src']['large2x'], $filename]);
                $saved++;
                if (--$remaining === 0) {
                    break;
                }
            }

            $page++;

            // Pace only the search requests; image downloads hit the CDN,
            // which is not subject to the API rate limit.
            $elapsedMs = (microtime(true) - $start) * 1000;
            if ($elapsedMs < $minIntervalMs) {
                usleep((int) (($minIntervalMs - $elapsedMs) * 1000));
            }
        }

        fclose($csv);
        $this->info("Saved {$saved} of {$count} requested photos to storage/app/pexels/.");

        return self::SUCCESS;
    }

    private function downloadWithRetry(Client $http, string $url): ?string
    {
        for ($i = 0; $i <= self::MAX_RETRIES; $i++) {
            try {
                $resp = $http->get($url, ['timeout' => 20]);
                if ($resp->getStatusCode() === 200) {
                    return (string) $resp->getBody();
                }
            } catch (\Throwable $e) {
                if ($i === self::MAX_RETRIES) {
                    $this->warn("download failed after retries: {$e->getMessage()}");
                    return null;
                }
                // Exponential backoff: 0.5 s before the first retry, 1 s before the second.
                usleep((int) ((2 ** $i) * 500_000));
            }
        }

        return null;
    }
}
What it demonstrates: synthesizing a working command from a vague spec, integrating four orthogonal concerns (retry with exponential backoff, rate limiting, CSV writing, file I/O via Laravel's disk abstraction) into one coherent whole, in idiomatic Laravel/PHP style, with sensible defaults where the spec is silent.
What it does not guarantee: that the code is bug-free. The retry math looks right; whether the rate-limit pacing actually holds to 200/min under bursty conditions is something a reader has to test. Models are excellent at code that looks right and reads cleanly; they are less reliable at code that survives an adversary.
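The two timing formulas, at least, can be checked without running the command. A quick sketch (in Python rather than PHP, purely so the arithmetic runs standalone) of what the command computes: the minimum spacing between search requests, and the backoff delays before each retry.

```python
import math

RATE_LIMIT_PER_MIN = 200
MAX_RETRIES = 2

# Spacing between search requests, as in handle():
# ceil(60_000 / 200) = 300 ms, i.e. at most ~200 paced requests per minute.
# Note only the search calls are paced; the CDN downloads in between are not.
min_interval_ms = math.ceil(60_000 / RATE_LIMIT_PER_MIN)

# Backoff before each retry, as in downloadWithRetry(): (2 ** i) * 0.5 s,
# so the three attempts happen at roughly t=0, t=0.5 s, t=1.5 s.
backoff_s = [(2 ** i) * 0.5 for i in range(MAX_RETRIES)]

print(min_interval_ms)  # 300
print(backoff_s)        # [0.5, 1.0]
```

The per-request spacing bounds the steady-state rate, but a client that was idle and then fires a paced burst can still exceed 200 calls inside a single sliding minute window; a token bucket or sliding-window counter would be the stricter guarantee.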