2 changes: 2 additions & 0 deletions .dockerignore

```diff
@@ -1,3 +1,5 @@
 .git
 .gitignore
+.DS_STORE
+
 node_modules/*
```
14 changes: 5 additions & 9 deletions Dockerfile

```diff
@@ -1,13 +1,9 @@
-FROM node:carbon-alpine
-RUN npm install -g npm@5.7
-
+FROM node:18-alpine
+ENV PORT=80
+EXPOSE 80
 ADD package*.json /code/
 WORKDIR /code
-RUN npm ci --no-optional --production
-
-CMD ["npm", "start"]
+RUN npm install -g npm && npm ci --omit=optional --omit=dev && npm cache clean --force
 ADD . /code/
 
-ENV PORT=80
-EXPOSE 80
-
+CMD ["npm", "start"]
```
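The npm flag change in this hunk is a rename rather than a behavior change: newer npm releases replace the legacy `--production` and `--no-optional` flags with the `--omit` option. A side-by-side sketch:

```shell
# Legacy spelling (older npm): skip devDependencies and optionalDependencies
npm ci --no-optional --production

# Current spelling (npm 8+): the same effect via the --omit option
npm ci --omit=optional --omit=dev
```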
13 changes: 11 additions & 2 deletions README.md

````diff
@@ -18,7 +18,7 @@ This reverse proxy will serve a private object over http at the expected (relati
 
 ### Prerequisites
 
-- node lts/carbon
+- Docker, or node 18+
 - google cloud bucket
 - google cloud authenticated machine - Make sure the host machine is authenticated with gcloud, and setup with the correct project.
 
@@ -36,6 +36,15 @@ The server will throw unless one or more TARGET_BUCKETS are defined.
 
 Setting the `HISTORY` env var to true will start the server in history mode, for better SPA support. It will respond with the buckets root `index.html` file for any path that doesn't have a file extension.
 
+If using containers and you want your container to use an application_default_credentials.json
+file then bind-mount it into the container and set the environment variable, eg.
+
+```
+GOOGLE_APPLICATION_CREDENTIALS=/application_default_credentials.json
+```
+
+You might obtain that with `gcloud auth application-default login` for example, and find it in `${HOME}/.config/gcloud`
+
 ### Run
 
 ```
@@ -79,6 +88,6 @@ Because of the above, I recommend you use Compute Engine, Kubernetes Engine or A
 
 ### Docker
 
-I've supplied a dockerfile for getting started without needing to install node. This will not work locally as it does not setup any authentication with google cloud sdk.
+I've supplied a dockerfile for getting started without needing to install node. This will not work locally as it does not setup any authentication with google cloud sdk. See above in Environment for help getting it to work locally within Docker.
 
 However when deployed to a compute engine VM instance authentication is handled for you. To see how to deploy a Docker image on a vm instance, check out [this guide](https://cloud.google.com/compute/docs/containers/deploying-containers). Don't forget to set the `TARGET_BUCKETS` environment variable, which can be done by following [this guide](https://cloud.google.com/compute/docs/containers/configuring-options-to-run-containers#setting_environment_variables).
````
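The credentials note added to the README can be exercised with a bind-mount along these lines. This is a sketch, not part of the PR: the image tag `gcs-proxy` and bucket name `my-bucket` are placeholders, and the host path assumes the default location written by `gcloud auth application-default login`.

```shell
# Placeholders: gcs-proxy (image tag), my-bucket (bucket name).
docker run --rm -p 8080:80 \
  -v "${HOME}/.config/gcloud/application_default_credentials.json:/application_default_credentials.json:ro" \
  -e GOOGLE_APPLICATION_CREDENTIALS=/application_default_credentials.json \
  -e TARGET_BUCKETS=my-bucket \
  gcs-proxy
```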
25 changes: 18 additions & 7 deletions index.js

```diff
@@ -1,13 +1,24 @@
+const {Storage} = require('@google-cloud/storage');
+const gcs = new Storage();
 const path = require("path");
 const url = require("url");
 
-const { send, createError } = require("micro");
-const get = require("micro-get");
-const compress = require("micro-compress");
+const { send } = require("micro");
+const get = function(fn) {
+  return (req, res) => {
+    res.setHeader('Access-Control-Request-Method', 'GET')
+    const {method} = req
+    if (method !== 'GET') {
+      res.writeHead(405)
+      res.end('Method Not Allowed')
+      return
+    }
+    return fn(req, res)
+  }
+}
 
-const gcs = require("@google-cloud/storage");
+const compress = require("micro-compress");
 
-const Storage = gcs();
 const bucketRef = {};
 
 const bucketPathRegexp = /^\/([^ \/]+)\/(.*)$/;
@@ -99,7 +110,7 @@ async function handleSingleBucket(req, res) {
   const bucketName = allowedBuckets[0];
 
   if (!bucketRef[bucketName]) {
-    bucketRef[bucketName] = await Storage.bucket(bucketName);
+    bucketRef[bucketName] = await gcs.bucket(bucketName);
   }
 
   const bucket = bucketRef[bucketName];
@@ -116,7 +127,7 @@ async function handleMultiBucket(req, res) {
   let filePath = urlPathToFsPath(matches[2]);
 
   if (!bucketRef[bucketName]) {
-    bucketRef[bucketName] = await Storage.bucket(bucketName);
+    bucketRef[bucketName] = await gcs.bucket(bucketName);
  }
 
   const bucket = bucketRef[bucketName];
```
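The hand-rolled replacement for `micro-get` in this diff is just a method guard wrapped around a handler. A standalone sketch of the same pattern, with no micro dependency; the inner handler and the mock req/res objects below are illustrative only, not part of the PR:

```javascript
// Method-guard wrapper: rejects non-GET requests with 405 before the
// wrapped handler runs (same shape as the `get` function in the diff).
const get = (fn) => (req, res) => {
  res.setHeader('Access-Control-Request-Method', 'GET');
  if (req.method !== 'GET') {
    res.writeHead(405);
    res.end('Method Not Allowed');
    return;
  }
  return fn(req, res);
};

// Hypothetical inner handler, for illustration only.
const handler = get((req, res) => {
  res.writeHead(200);
  res.end('ok');
});

// Exercise it with minimal stand-ins for Node's req/res objects,
// recording each status code and body that gets written.
const calls = [];
const res = {
  setHeader: () => {},
  writeHead: (code) => calls.push(code),
  end: (body) => calls.push(body),
};
handler({ method: 'POST' }, res); // blocked: records 405, 'Method Not Allowed'
handler({ method: 'GET' }, res);  // allowed: records 200, 'ok'
console.log(calls);               // [ 405, 'Method Not Allowed', 200, 'ok' ]
```

Because the wrapper returns a plain `(req, res)` handler, it composes with micro's `compress` and `send` exactly as the old `micro-get` export did.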