Local S3 Storage with MinIO

Recently, I had a project where I needed to copy files from an SFTP location, process them, and store them on AWS S3. Using a production S3 environment for testing isn't ideal from a security standpoint, and creating separate S3 buckets would add cost and credential-management overhead, so I needed something different.

On the hunt for something local, S3-API compatible, and Docker-friendly, I found MinIO. First and foremost it's a high-performance object storage service, but it offers a self-hosted version too.

Getting started

I headed over to their documentation and found they have a Docker solution, which was perfect for my needs. I followed the Docker (Rootfull) guide, and soon enough I had a new MinIO container ready to go. From there I opened the browser console (http://localhost:9090/browser), logged in with the credentials I set during the guide, and began configuring.
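For reference, starting the container looks something like this (a minimal sketch based on MinIO's Docker guide; the host data path, container name, and root credentials below are placeholders, so substitute your own):

```shell
# Run MinIO with the S3 API on port 9000 and the web console on port 9090.
# ~/minio/data is a hypothetical host path used for persistent storage.
docker run -d \
  --name minio \
  -p 9000:9000 \
  -p 9090:9090 \
  -v ~/minio/data:/data \
  -e "MINIO_ROOT_USER=admin" \
  -e "MINIO_ROOT_PASSWORD=change_me_please" \
  quay.io/minio/minio server /data --console-address ":9090"
```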

The first thing we need to do is navigate to Administrator->Identity->Users and create a User with the readwrite policy. Next, we select the user we've just created, navigate to Service Accounts, click Create Access Key, and make sure to copy the credentials before hitting Create.
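If you prefer the command line, the same steps can be scripted with MinIO's mc client (a sketch; the alias name, username, and passwords here are hypothetical):

```shell
# Point mc at the local server using the container's root credentials.
mc alias set local http://127.0.0.1:9000 admin change_me_please

# Create a user and attach the built-in readwrite policy to it.
mc admin user add local app_user app_user_secret
mc admin policy attach local readwrite --user app_user

# Generate an access key / secret key pair (a service account) for the user.
mc admin user svcacct add local app_user
```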

Next, we navigate to Administrator->Buckets, click Create Bucket, fill out the Bucket Name, and feel free to leave all the features Off. After saving the bucket, we click on it, navigate to the Summary tab, and make sure the Access Policy is set to private. Finally, to confirm we have access, we navigate within our bucket to Access->Users and ensure our user is listed.
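The bucket can also be created from the command line, assuming an mc alias named local has been set up for the server (via mc alias set local http://127.0.0.1:9000 ...); the bucket name is the placeholder from our config:

```shell
# Create the bucket under the "local" alias.
mc mb local/bucket_name

# Keep the bucket private: disable all anonymous access.
mc anonymous set none local/bucket_name
```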

Accessing buckets

Web Access:
The MinIO Console has an Object Browser built in, which allows uploading, downloading, and deleting files and folders.

PHP / Laravel:
If we're accessing the storage via PHP, one of the best solutions is the league/flysystem package, which comes bundled with the Laravel Filesystem. Within Laravel we can use the pre-defined AWS config in the .env for MinIO, using the Access Key and Secret Key we copied earlier.

AWS_ACCESS_KEY_ID=minio_access_key
AWS_SECRET_ACCESS_KEY=minio_secret_key
AWS_DEFAULT_REGION=us-east-1
AWS_BUCKET=bucket_name
AWS_ENDPOINT=http://127.0.0.1:9000
AWS_USE_PATH_STYLE_ENDPOINT=true

If your Laravel version's config/filesystems.php includes the use_path_style_endpoint option, setting AWS_USE_PATH_STYLE_ENDPOINT=true tells the SDK to address the bucket as a path (http://127.0.0.1:9000/bucket_name) rather than as a subdomain, which won't resolve locally.

To test our connection we can use php artisan tinker and run Storage::disk('s3')->allFiles(), and we should see [] because the bucket is empty.
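To see a non-empty listing, we can push a file into the bucket from the command line (a sketch; it assumes an mc alias named local for the server, and the bucket name and file name are placeholders):

```shell
# Create a small test file and copy it into the bucket...
echo "hello from minio" > hello.txt
mc cp hello.txt local/bucket_name/hello.txt

# ...then list the bucket contents to confirm it landed.
mc ls local/bucket_name
```

Re-running Storage::disk('s3')->allFiles() in tinker should now list the uploaded file.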

FTP Clients:
If you're like me, you'll prefer using an FTP client over a web browser. My tool of choice is Transmit 5, as I think it has the nicest UI and I'm confident in its S3 compatibility. Whilst I would recommend it, getting it to connect to MinIO was not straightforward. Upon making a new connection in Transmit, I set the protocol to AWS S3, entered the Access Key ID and Secret from MinIO, set the Address to 127.0.0.1, the Port to 9090, and the Root URL to http://127.0.0.1, but got authentication errors. After much frustration and search result after search result, I found my saviour: PHP Monitor. I used PHP Monitor to create a secure proxy domain of minio.test for http://127.0.0.1:9000. Back in Transmit, I set the Address to minio.test and the Root URL to https://minio.test/, and we had a successful connection!

Hopefully this has been a useful guide to setting up AWS S3-compatible storage locally in your development environment.