Rewrite site backend in Rust (#178)

* add shell.nix changes for Rust #176

* set up base crate layout

* add first set of dependencies

* start adding basic app modules

* start html templates

* serve index page

* add contact and feeds pages

* add resume rendering support

* resume cleanups

* get signalboost page working

* rewrite config to be in dhall

* more work

* basic generic post loading

* more tests

* initial blog index support

* fix routing?

* render blogposts

* X-Clacks-Overhead

* split blog handlers into blog.rs

* gallery index

* gallery posts

* fix hashtags

* remove instantpage (it messes up the metrics)

* talk support + prometheus

* Create rust.yml

* Update rust.yml

* Update codeql-analysis.yml

* add jsonfeed library

* jsonfeed support

* rss/atom

* go mod tidy

* atom: add posted date

* rss: add publishing date

* nix: build rust program

* rip out go code

* rip out go templates

* prepare for serving in docker

* create kubernetes deployment

* create automagic deployment

* build docker images on non-master

* more fixes

* fix timestamps

* fix RSS/Atom/JSONFeed validation errors

* add go vanity import redirecting

* templates/header: remove this

* atom feed: fixes

* fix?

* fix??

* fix rust tests

* Update rust.yml

* automatically show snow during the winter

* fix dates

* show commit link in footer

* sitemap support

* fix compiler warning

* start basic patreon client

* integrate kankyo

* fix patreon client

* add patrons page

* remove this

* handle patron errors better

* fix build

* clean up deploy

* sort envvars for deploy

* remove deps.nix

* shell.nix: remove go

* update README

* fix envvars for tests

* nice

* blog: add rewrite in rust post

* blog/site-update: more words
Cadey Ratio 2020-07-16 15:32:30 -04:00 committed by GitHub
parent 449e934246
commit 385d25c9f9
110 changed files with 6352 additions and 3294 deletions

.gitattributes
@@ -1,2 +1 @@
nix/deps.nix linguist-vendored
nix/sources.nix linguist-vendored

@@ -1,39 +0,0 @@
name: "Code scanning - action"
on:
push:
pull_request:
schedule:
- cron: '0 18 * * 6'
jobs:
CodeQL-Build:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v2
with:
# We must fetch at least the immediate parents so that if this is
# a pull request then we can checkout the head.
fetch-depth: 2
# If this run was triggered by a pull request event, then checkout
# the head of the pull request instead of the merge commit.
- run: git checkout HEAD^2
if: ${{ github.event_name == 'pull_request' }}
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
# Override language selection by uncommenting this and choosing your languages
with:
languages: go
# Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v1
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1

@@ -1,21 +0,0 @@
name: Go
on:
- push
- pull_request
jobs:
build:
name: Build
runs-on: ubuntu-latest
steps:
- name: Set up Go 1.14
uses: actions/setup-go@v1
with:
go-version: 1.14
id: go
- name: Check out code into the Go module directory
uses: actions/checkout@v1
- name: Test
run: go test -v ./...
env:
GO111MODULE: on
GOPROXY: https://cache.greedo.xeserv.us

@@ -1,80 +0,0 @@
name: "CI/CD"
on:
push:
branches:
- master
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- name: Build container image
run: |
docker build -t xena/christinewebsite:$(echo $GITHUB_SHA | head -c7) .
echo $DOCKER_PASSWORD | docker login -u $DOCKER_USERNAME --password-stdin
docker push xena/christinewebsite
env:
DOCKER_USERNAME: "xena"
DOCKER_PASSWORD: ${{ secrets.DOCKER_PASSWORD }}
- name: Download secrets/Install/Configure/Use Dyson
run: |
mkdir ~/.ssh
echo $FILE_DATA | base64 -d > ~/.ssh/id_rsa
md5sum ~/.ssh/id_rsa
chmod 600 ~/.ssh/id_rsa
git clone git@ssh.tulpa.dev:cadey/within-terraform-secret
curl https://xena.greedo.xeserv.us/files/dyson-linux-amd64-0.1.0.tgz | tar xz
cp ./dyson-linux-amd64-0.1.1/dyson .
rm -rf dyson-linux-amd64-0.1.1
mkdir -p ~/.config/dyson
echo '[DigitalOcean]
Token = ""
[Cloudflare]
Email = ""
Token = ""
[Secrets]
GitCheckout = "./within-terraform-secret"' > ~/.config/dyson/dyson.ini
./dyson manifest \
--name=christinewebsite \
--domain=christine.website \
--dockerImage=xena/christinewebsite:$(echo $GITHUB_SHA | head -c7) \
--containerPort=5000 \
--replicas=2 \
--useProdLE=true > $GITHUB_WORKSPACE/deploy.yml
env:
FILE_DATA: ${{ secrets.SSH_PRIVATE_KEY }}
GIT_SSH_COMMAND: "ssh -i ~/.ssh/id_rsa -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no"
- name: Save DigitalOcean kubeconfig
uses: digitalocean/action-doctl@master
env:
DIGITALOCEAN_ACCESS_TOKEN: ${{ secrets.DIGITALOCEAN_TOKEN }}
with:
args: kubernetes cluster kubeconfig show kubermemes > $GITHUB_WORKSPACE/.kubeconfig
- name: Deploy to DigitalOcean Kubernetes
uses: docker://lachlanevenson/k8s-kubectl
with:
args: --kubeconfig=/github/workspace/.kubeconfig apply -n apps -f /github/workspace/deploy.yml
- name: Verify deployment
uses: docker://lachlanevenson/k8s-kubectl
with:
args: --kubeconfig=/github/workspace/.kubeconfig rollout status -n apps deployment/christinewebsite
- name: Ping Google
uses: docker://lachlanevenson/k8s-kubectl
with:
args: --kubeconfig=/github/workspace/.kubeconfig apply -f /github/workspace/k8s/job.yml
- name: Sleep
run: |
sleep 5
- name: Don't Ping Google
uses: docker://lachlanevenson/k8s-kubectl
with:
args: --kubeconfig=/github/workspace/.kubeconfig delete -f /github/workspace/k8s/job.yml
- name: POSSE
env:
MI_TOKEN: ${{ secrets.MI_TOKEN }}
run: |
curl -H "Authorization: $MI_TOKEN" --data "https://christine.website/blog.json" https://mi.within.website/blog/refresh

@@ -1,16 +1,42 @@
name: "Nix"
on:
push:
branches:
- master
pull_request:
branches:
- master
jobs:
tests:
docker-build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- uses: cachix/install-nix-action@v6
- uses: cachix/cachix-action@v3
with:
name: xe
- run: |
nix-build docker.nix
docker load -i result
docker tag xena/christinewebsite:latest xena/christinewebsite:$(echo $GITHUB_SHA | head -c7)
- uses: actions/checkout@v1
- uses: cachix/install-nix-action@v6
- uses: cachix/cachix-action@v3
with:
name: xe
signingKey: '${{ secrets.CACHIX_SIGNING_KEY }}'
authToken: '${{ secrets.CACHIX_AUTH_TOKEN }}'
- run: |
docker load -i result
docker tag xena/christinewebsite:latest xena/christinewebsite:$GITHUB_SHA
echo $DOCKER_PASSWORD | docker login -u $DOCKER_USERNAME --password-stdin
docker push xena/christinewebsite
env:
DOCKER_USERNAME: "xena"
DOCKER_PASSWORD: ${{ secrets.DOCKER_PASSWORD }}
release:
runs-on: ubuntu-latest
needs: docker-build
if: github.ref == 'refs/heads/master'
steps:
- uses: cachix/install-nix-action@v6
- name: deploy
run: ./scripts/release.sh
env:
DIGITALOCEAN_ACCESS_TOKEN: ${{ secrets.DIGITALOCEAN_TOKEN }}
MI_TOKEN: ${{ secrets.MI_TOKEN }}
PATREON_ACCESS_TOKEN: ${{ secrets.PATREON_ACCESS_TOKEN }}
PATREON_CLIENT_ID: ${{ secrets.PATREON_CLIENT_ID }}
PATREON_CLIENT_SECRET: ${{ secrets.PATREON_CLIENT_SECRET }}
PATREON_REFRESH_TOKEN: ${{ secrets.PATREON_REFRESH_TOKEN }}

.github/workflows/rust.yml
@@ -0,0 +1,25 @@
name: Rust
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
env:
CARGO_TERM_COLOR: always
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Build
run: cargo build --all
- name: Run tests
run: |
cargo test
(cd lib/jsonfeed && cargo test)
(cd lib/patreon && cargo test)
env:
PATREON_ACCESS_TOKEN: ${{ secrets.PATREON_ACCESS_TOKEN }}
PATREON_CLIENT_ID: ${{ secrets.PATREON_CLIENT_ID }}
PATREON_CLIENT_SECRET: ${{ secrets.PATREON_CLIENT_SECRET }}
PATREON_REFRESH_TOKEN: ${{ secrets.PATREON_REFRESH_TOKEN }}

.gitignore
@@ -5,4 +5,4 @@ cw.tar
/result-*
/result
.#*
/target

Cargo.lock
File diff suppressed because it is too large

Cargo.toml
@@ -0,0 +1,48 @@
[package]
name = "xesite"
version = "2.0.0"
authors = ["Christine Dodrill <me@christine.website>"]
edition = "2018"
build = "src/build.rs"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
anyhow = "1"
atom_syndication = { version = "0.9", features = ["with-serde"] }
chrono = "0.4"
comrak = "0.8"
envy = "0.4"
glob = "0.3"
hyper = "0.13"
kankyo = "0.3"
lazy_static = "1.4"
log = "0"
mime = "0.3.0"
pretty_env_logger = "0"
prometheus = { version = "0.9", default-features = false, features = ["process"] }
rand = "0"
rss = "1"
serde_dhall = "0.5.3"
serde = { version = "1", features = ["derive"] }
serde_yaml = "0.8"
sitemap = "0.4"
thiserror = "1"
tokio = { version = "0.2", features = ["macros"] }
warp = "0.2"
xml-rs = "0.8"
# workspace dependencies
go_vanity = { path = "./lib/go_vanity" }
jsonfeed = { path = "./lib/jsonfeed" }
patreon = { path = "./lib/patreon" }
[build-dependencies]
ructe = { version = "0.11", features = ["warp02"] }
[workspace]
members = [
"./lib/go_vanity",
"./lib/jsonfeed",
"./lib/patreon"
]

@@ -1,20 +0,0 @@
FROM xena/go:1.14 AS build
ENV GOPROXY https://cache.greedo.xeserv.us
COPY . /site
WORKDIR /site
RUN CGO_ENABLED=0 go test -v ./...
RUN CGO_ENABLED=0 GOBIN=/root go install -v ./cmd/site
FROM xena/alpine
EXPOSE 5000
WORKDIR /site
COPY --from=build /root/site .
COPY ./static /site/static
COPY ./templates /site/templates
COPY ./blog /site/blog
COPY ./talks /site/talks
COPY ./gallery /site/gallery
COPY ./css /site/css
COPY ./signalboost.dhall /site/signalboost.dhall
HEALTHCHECK CMD wget --spider http://127.0.0.1:5000/.within/health || exit 1
CMD ./site

@@ -1,4 +1,4 @@
Copyright (c) 2017 Christine Dodrill <me@christine.website>
Copyright (c) 2017-2020 Christine Dodrill <me@christine.website>
This software is provided 'as-is', without any express or implied
warranty. In no event will the authors be held liable for any damages

@@ -1,5 +1,8 @@
# site
My personal/portfolio website.
[![built with
nix](https://builtwithnix.org/badge.svg)](https://builtwithnix.org)
![Nix](https://github.com/Xe/site/workflows/Nix/badge.svg)
![Rust](https://github.com/Xe/site/workflows/Rust/badge.svg)
![https://puu.sh/vWnJx/57cda175d8.png](https://puu.sh/vWnJx/57cda175d8.png)
My personal/portfolio website.

@@ -0,0 +1,189 @@
---
title: "Site Update: Rewrite in Rust"
date: 2020-07-16
tags:
- rust
---
# Site Update: Rewrite in Rust
Hello there! You are reading this post thanks to a lot of effort, research and
consultation that has resulted in a complete from-scratch rewrite of this
website in [Rust](https://rust-lang.org). The original implementation in Go is
available [here](https://github.com/Xe/site/releases/tag/v1.5.0) should anyone
want to reference that for any reason.
If you find any issues with the [RSS feed](/blog.rss), [Atom feed](/blog.atom)
or [JSONFeed](/blog.json), please let me know as soon as possible so I can fix
them.
This website stands on the shoulders of giants. Here are just a few of them and
how they add up into this whole package.
## comrak
All of my posts are written in
[markdown](https://github.com/Xe/site/blob/master/blog/all-there-is-is-now-2019-05-25.markdown).
[comrak](https://github.com/kivikakk/comrak) is a markdown parser written by a
friend of mine that is as fast and as correct as possible. comrak does the job
of turning all of that markdown (over 150 files at the time of writing this
post) into the HTML that you are reading right now. It also supports a lot of
common markdown extensions, which I use heavily in my posts.
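Rendering a post body boils down to a single function call. Here is a minimal
sketch of that call (the real site also enables several of comrak's extension
options on top of the defaults used here):
```rust
use comrak::{markdown_to_html, ComrakOptions};

fn main() {
    // Convert a small piece of markdown into HTML using comrak's default options.
    let html = markdown_to_html("Hello, **world**!", &ComrakOptions::default());

    // Prints: <p>Hello, <strong>world</strong>!</p>
    print!("{}", html);
}
```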
## warp
[warp](https://github.com/seanmonstar/warp) is the web framework I use for Rust.
It gives users a set of filters that add up into entire web applications. For
instance, here is the hello-world example from its readme:
```rust
use warp::Filter;
#[tokio::main]
async fn main() {
    // GET /hello/warp => 200 OK with body "Hello, warp!"
    let hello = warp::path!("hello" / String)
        .map(|name| format!("Hello, {}!", name));

    warp::serve(hello)
        .run(([127, 0, 0, 1], 3030))
        .await;
}
```
This can then be built up into something like this:
```rust
let site = index
    .or(contact.or(feeds).or(resume.or(signalboost)).or(patrons))
    .or(blog_index.or(series.or(series_view).or(post_view)))
    .or(gallery_index.or(gallery_post_view))
    .or(talk_index.or(talk_post_view))
    .or(jsonfeed.or(atom).or(rss.or(sitemap)))
    .or(files.or(css).or(favicon).or(sw.or(robots)))
    .or(healthcheck.or(metrics_endpoint).or(go_vanity_jsonfeed))
    // ...
```
which is the actual routing setup for this website!
## ructe
In the previous version of this site, I used Go's
[html/template](https://godoc.org/html/template). Rust does not have an
equivalent of html/template in its standard library. After some research, I
settled on [ructe](https://github.com/kaj/ructe) for the HTML templates. ructe
works by preprocessing templates written in a little domain-specific language
into Rust source code. This means the templates are compiled and optimized along
with the rest of the program, which lets my website render most pages in less
than 100 microseconds. Here is an example template (the one for
[/patrons](/patrons)):
```html
@use patreon::Users;
@use super::{header_html, footer_html};
@(users: Users)
@:header_html(Some("Patrons"), None)
<h1>Patrons</h1>
<p>These awesome people donate to me on <a href="https://patreon.com/cadey">Patreon</a>.
If you would like to show up in this list, please donate to me on Patreon. This
is refreshed every time the site is deployed.</p>
<p>
<ul>
@for user in users {
<li>@user.attributes.full_name</li>
}
</ul>
</p>
@:footer_html()
```
The templates compile down to Rust, which lets me include other parts of the
program into the templates. Here I use that to take a list of users from the
incredibly hacky Patreon API client I wrote for this website and iterate over
it, making a list of every patron by name.
## Build Process
As a nice side effect of this rewrite, my website is now completely built using
[Nix](https://nixos.org/). This allows the website to be built reproducibly, and
it gives anyone who checks out the repo and runs `nix-shell` a full development
environment for free. Check out
[naersk](https://github.com/nmattia/naersk) for the secret sauce that enables my
docker image build. See [this blogpost](/blog/drone-kubernetes-cd-2020-07-10)
for more information about this build process (though my site uses GitHub
Actions instead of Drone).
## `jsonfeed` Go package
I used to have a [JSONFeed](https://www.jsonfeed.org/) package publicly visible
at the Go import path `christine.website/jsonfeed`. As far as I know I'm the
only person who ended up using it, but in case there are any private repos that
I don't know about depending on it, I have kept the jsonfeed package available
at its old import path; its source code now lives
[here](https://tulpa.dev/Xe/jsonfeed). You may have to update your `go.mod` file
to import `christine.website/jsonfeed` instead of `christine.website`. If
something ends up going wrong as a result of this, please [file a GitHub issue
here](https://github.com/Xe/site/issues/new) and I can attempt to assist
further.
## `go_vanity` crate
I have written a small Go vanity import crate and exposed it in my Git repo. If
you want to use it, add it to your `Cargo.toml` like this:
```toml
[dependencies]
go_vanity = { git = "https://github.com/Xe/site", branch = "master" }
```
You can then use it from any warp application by calling `go_vanity::github` or
`go_vanity::gitea` like this:
```rust
let go_vanity_jsonfeed = warp::path("jsonfeed")
    .and(warp::any().map(move || "christine.website/jsonfeed"))
    .and(warp::any().map(move || "https://tulpa.dev/Xe/jsonfeed"))
    .and_then(go_vanity::gitea);
```
I plan to add full documentation to this crate soon, as well as to release it
properly on crates.io.
## `patreon` crate
I have also written a small [Patreon](https://www.patreon.com/) API client and
made it available in my Git repo. If you want to use it, add it to your
`Cargo.toml` like this:
```toml
[dependencies]
patreon = { git = "https://github.com/Xe/site", branch = "master" }
```
This client is _incredibly limited_ and only supports the minimum parts of the
Patreon API that are required for my website to function. Patreon has also
apparently started to phase out support for its API anyways, so I don't know how
long this will be useful.
But this is there should you need it!
## Dhall Kubernetes Manifest
I also took the time to port the kubernetes manifest to
[Dhall](https://dhall-lang.org/). This allows me to have a type-safe kubernetes
manifest that will correctly have all of the secrets injected for me from the
environment of the deploy script.
---
These are the biggest giants that my website now sits on. The code for this
rewrite is still a bit messy. I'm working on making it better, but my goal is to
have this website's code shine as an example of how to best write this kind of
website in Rust. Check out the code [here](https://github.com/Xe/site).

@@ -1,25 +0,0 @@
package main
import (
"math/rand"
"net/http"
"time"
)
type ClackSet []string
func (cs ClackSet) Name() string {
return "GNU " + cs[rand.Intn(len(cs))]
}
func (cs ClackSet) Middleware(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
w.Header().Add("X-Clacks-Overhead", cs.Name())
next.ServeHTTP(w, r)
})
}
func init() {
rand.Seed(time.Now().Unix())
}

@@ -1,245 +0,0 @@
package main
import (
"context"
"fmt"
"html/template"
"net/http"
"path/filepath"
"strings"
"time"
"christine.website/cmd/site/internal"
"christine.website/cmd/site/internal/blog"
"github.com/prometheus/client_golang/prometheus"
"github.com/prometheus/client_golang/prometheus/promauto"
"within.website/ln"
"within.website/ln/opname"
)
var (
templateRenderTime = promauto.NewHistogramVec(prometheus.HistogramOpts{
Name: "template_render_time",
Help: "Template render time in nanoseconds",
}, []string{"name"})
)
func logTemplateTime(ctx context.Context, name string, f ln.F, from time.Time) {
dur := time.Since(from)
templateRenderTime.With(prometheus.Labels{"name": name}).Observe(float64(dur))
ln.Log(ctx, f, ln.F{"dur": dur, "name": name})
}
func (s *Site) renderTemplatePage(templateFname string, data interface{}) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
ctx := opname.With(r.Context(), "renderTemplatePage")
fetag := "W/" + internal.Hash(templateFname, etag) + "-1"
f := ln.F{"etag": fetag, "if_none_match": r.Header.Get("If-None-Match")}
if r.Header.Get("If-None-Match") == fetag {
http.Error(w, "Cached data OK", http.StatusNotModified)
ln.Log(ctx, f, ln.Info("Cache hit"))
return
}
defer logTemplateTime(ctx, templateFname, f, time.Now())
var t *template.Template
var err error
t, err = template.ParseFiles("templates/base.html", "templates/"+templateFname)
if err != nil {
w.WriteHeader(http.StatusInternalServerError)
ln.Error(ctx, err, ln.F{"action": "renderTemplatePage", "page": templateFname})
fmt.Fprintf(w, "error: %v", err)
}
w.Header().Set("ETag", fetag)
w.Header().Set("Cache-Control", "max-age=432000")
err = t.Execute(w, data)
if err != nil {
panic(err)
}
})
}
var postView = promauto.NewCounterVec(prometheus.CounterOpts{
Name: "posts_viewed",
Help: "The number of views per post or talk",
}, []string{"base"})
func (s *Site) listSeries(w http.ResponseWriter, r *http.Request) {
s.renderTemplatePage("series.html", s.Series).ServeHTTP(w, r)
}
func (s *Site) showSeries(w http.ResponseWriter, r *http.Request) {
if r.RequestURI == "/blog/series/" {
http.Redirect(w, r, "/blog/series", http.StatusSeeOther)
return
}
series := filepath.Base(r.URL.Path)
var posts []blog.Post
for _, p := range s.Posts {
if p.Series == series {
posts = append(posts, p)
}
}
s.renderTemplatePage("serieslist.html", struct {
Name string
Posts []blog.Post
}{
Name: series,
Posts: posts,
}).ServeHTTP(w, r)
}
func (s *Site) showGallery(w http.ResponseWriter, r *http.Request) {
if r.RequestURI == "/gallery/" {
http.Redirect(w, r, "/gallery", http.StatusSeeOther)
return
}
cmp := r.URL.Path[1:]
var p blog.Post
var found bool
for _, pst := range s.Gallery {
if pst.Link == cmp {
p = pst
found = true
}
}
if !found {
w.WriteHeader(http.StatusNotFound)
s.renderTemplatePage("error.html", "no such post found: "+r.RequestURI).ServeHTTP(w, r)
return
}
var tags string
if len(p.Tags) != 0 {
for _, t := range p.Tags {
tags = tags + " #" + strings.ReplaceAll(t, "-", "")
}
}
h := s.renderTemplatePage("gallerypost.html", struct {
Title string
Link string
BodyHTML template.HTML
Date string
Tags string
Image string
}{
Title: p.Title,
Link: p.Link,
BodyHTML: p.BodyHTML,
Date: internal.IOS13Detri(p.Date),
Tags: tags,
Image: p.ImageURL,
})
if h == nil {
panic("how did we get here?")
}
h.ServeHTTP(w, r)
postView.With(prometheus.Labels{"base": filepath.Base(p.Link)}).Inc()
}
func (s *Site) showTalk(w http.ResponseWriter, r *http.Request) {
if r.RequestURI == "/talks/" {
http.Redirect(w, r, "/talks", http.StatusSeeOther)
return
}
cmp := r.URL.Path[1:]
var p blog.Post
var found bool
for _, pst := range s.Talks {
if pst.Link == cmp {
p = pst
found = true
}
}
if !found {
w.WriteHeader(http.StatusNotFound)
s.renderTemplatePage("error.html", "no such post found: "+r.RequestURI).ServeHTTP(w, r)
return
}
h := s.renderTemplatePage("talkpost.html", struct {
Title string
Link string
BodyHTML template.HTML
Date string
SlidesLink string
}{
Title: p.Title,
Link: p.Link,
BodyHTML: p.BodyHTML,
Date: internal.IOS13Detri(p.Date),
SlidesLink: p.SlidesLink,
})
if h == nil {
panic("how did we get here?")
}
h.ServeHTTP(w, r)
postView.With(prometheus.Labels{"base": filepath.Base(p.Link)}).Inc()
}
func (s *Site) showPost(w http.ResponseWriter, r *http.Request) {
if r.RequestURI == "/blog/" {
http.Redirect(w, r, "/blog", http.StatusSeeOther)
return
}
cmp := r.URL.Path[1:]
var p blog.Post
var found bool
for _, pst := range s.Posts {
if pst.Link == cmp {
p = pst
found = true
}
}
if !found {
w.WriteHeader(http.StatusNotFound)
s.renderTemplatePage("error.html", "no such post found: "+r.RequestURI).ServeHTTP(w, r)
return
}
var tags string
if len(p.Tags) != 0 {
for _, t := range p.Tags {
tags = tags + " #" + strings.ReplaceAll(t, "-", "")
}
}
s.renderTemplatePage("blogpost.html", struct {
Title string
Link string
BodyHTML template.HTML
Date string
Series, SeriesTag string
Tags string
}{
Title: p.Title,
Link: p.Link,
BodyHTML: p.BodyHTML,
Date: internal.IOS13Detri(p.Date),
Series: p.Series,
SeriesTag: strings.ReplaceAll(p.Series, "-", ""),
Tags: tags,
}).ServeHTTP(w, r)
postView.With(prometheus.Labels{"base": filepath.Base(p.Link)}).Inc()
}

@@ -1,137 +0,0 @@
package blog
import (
"html/template"
"io/ioutil"
"os"
"path/filepath"
"sort"
"strings"
"time"
"christine.website/cmd/site/internal/front"
"github.com/russross/blackfriday"
)
// Post is a single blogpost.
type Post struct {
Title string `json:"title"`
Link string `json:"link"`
Summary string `json:"summary,omitifempty"`
Body string `json:"-"`
BodyHTML template.HTML `json:"body"`
Series string `json:"series"`
Tags []string `json:"tags"`
SlidesLink string `json:"slides_link"`
ImageURL string `json:"image_url"`
ThumbURL string `json:"thumb_url"`
Date time.Time
DateString string `json:"date"`
}
// Posts implements sort.Interface for a slice of Post objects.
type Posts []Post
func (p Posts) Series() []string {
names := map[string]struct{}{}
for _, ps := range p {
if ps.Series != "" {
names[ps.Series] = struct{}{}
}
}
var result []string
for name := range names {
result = append(result, name)
}
return result
}
func (p Posts) Len() int { return len(p) }
func (p Posts) Less(i, j int) bool {
iDate := p[i].Date
jDate := p[j].Date
return iDate.Unix() < jDate.Unix()
}
func (p Posts) Swap(i, j int) { p[i], p[j] = p[j], p[i] }
// LoadPosts loads posts for a given directory.
func LoadPosts(path string, prepend string) (Posts, error) {
type postFM struct {
Title string
Date string
Series string
Tags []string
SlidesLink string `yaml:"slides_link"`
Image string
Thumb string
Show string
}
var result Posts
err := filepath.Walk(path, func(path string, info os.FileInfo, err error) error {
if err != nil {
return err
}
if info.IsDir() {
return nil
}
fin, err := os.Open(path)
if err != nil {
return err
}
defer fin.Close()
content, err := ioutil.ReadAll(fin)
if err != nil {
return err
}
var fm postFM
remaining, err := front.Unmarshal(content, &fm)
if err != nil {
return err
}
output := blackfriday.Run(remaining)
const timeFormat = `2006-01-02`
date, err := time.Parse(timeFormat, fm.Date)
if err != nil {
return err
}
fname := filepath.Base(path)
fname = strings.TrimSuffix(fname, filepath.Ext(fname))
p := Post{
Title: fm.Title,
Date: date,
DateString: fm.Date,
Link: filepath.Join(prepend, fname),
Body: string(remaining),
BodyHTML: template.HTML(output),
SlidesLink: fm.SlidesLink,
Series: fm.Series,
Tags: fm.Tags,
ImageURL: fm.Image,
ThumbURL: fm.Thumb,
}
result = append(result, p)
return nil
})
if err != nil {
return nil, err
}
sort.Sort(sort.Reverse(result))
return result, nil
}

@@ -1,66 +0,0 @@
package blog
import (
"testing"
)
func TestLoadPosts(t *testing.T) {
posts, err := LoadPosts("../../../../blog", "blog")
if err != nil {
t.Fatal(err)
}
for _, post := range posts {
t.Run(post.Link, post.test)
}
}
func TestLoadTalks(t *testing.T) {
talks, err := LoadPosts("../../../../talks", "talks")
if err != nil {
t.Fatal(err)
}
for _, talk := range talks {
t.Run(talk.Link, talk.test)
if talk.SlidesLink == "" {
t.Errorf("talk %s (%s) doesn't have a slides link", talk.Title, talk.DateString)
}
}
}
func TestLoadGallery(t *testing.T) {
gallery, err := LoadPosts("../../../../gallery", "gallery")
if err != nil {
t.Fatal(err)
}
for _, art := range gallery {
t.Run(art.Link, art.test)
if art.ImageURL == "" {
t.Errorf("art %s (%s) doesn't have an image link", art.Title, art.DateString)
}
if art.ThumbURL == "" {
t.Errorf("art %s (%s) doesn't have a thumbnail link", art.Title, art.DateString)
}
}
}
func (p Post) test(t *testing.T) {
if p.Title == "" {
t.Error("no post title")
}
if p.DateString == "" {
t.Error("no date")
}
if p.Link == "" {
t.Error("no link")
}
if p.Body == "" {
t.Error("no body")
}
}

@@ -1,10 +0,0 @@
package internal
import "time"
const iOS13DetriFormat = `2006 M1 2`
// IOS13Detri formats a datestamp like iOS 13 does with the Lojban locale.
func IOS13Detri(t time.Time) string {
return t.Format(iOS13DetriFormat)
}

@@ -1,28 +0,0 @@
package internal
import (
"fmt"
"testing"
"time"
)
func TestIOS13Detri(t *testing.T) {
cases := []struct {
in time.Time
out string
}{
{
in: time.Date(2019, time.March, 30, 0, 0, 0, 0, time.FixedZone("UTC", 0)),
out: "2019 M3 30",
},
}
for _, cs := range cases {
t.Run(fmt.Sprintf("%s -> %s", cs.in.Format(time.RFC3339), cs.out), func(t *testing.T) {
result := IOS13Detri(cs.in)
if result != cs.out {
t.Fatalf("wanted: %s, got: %s", cs.out, result)
}
})
}
}

@@ -1,19 +0,0 @@
Copyright (c) 2017 TJ Holowaychuk <tj@vision-media.ca>
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

@@ -1,24 +0,0 @@
// Package front provides YAML frontmatter unmarshalling.
package front
import (
"bytes"
"gopkg.in/yaml.v2"
)
// Delimiter.
var delim = []byte("---")
// Unmarshal parses YAML frontmatter and returns the content. When no
// frontmatter delimiters are present the original content is returned.
func Unmarshal(b []byte, v interface{}) (content []byte, err error) {
if !bytes.HasPrefix(b, delim) {
return b, nil
}
parts := bytes.SplitN(b, delim, 3)
content = parts[2]
err = yaml.Unmarshal(parts[1], v)
return
}

@@ -1,42 +0,0 @@
package front_test
import (
"fmt"
"log"
"christine.website/cmd/site/internal/front"
)
var markdown = []byte(`---
title: Ferrets
authors:
- Tobi
- Loki
- Jane
---
Some content here, so
interesting, you just
want to keep reading.`)
type article struct {
Title string
Authors []string
}
func Example() {
var a article
content, err := front.Unmarshal(markdown, &a)
if err != nil {
log.Fatalf("error unmarshalling: %s", err)
}
fmt.Printf("%#v\n", a)
fmt.Printf("%s\n", string(content))
// Output:
// front_test.article{Title:"Ferrets", Authors:[]string{"Tobi", "Loki", "Jane"}}
//
// Some content here, so
// interesting, you just
// want to keep reading.
}

@@ -1,14 +0,0 @@
package internal
import (
"crypto/md5"
"fmt"
)
// Hash is a simple wrapper around the MD5 algorithm implementation in the
// Go standard library. It takes in data and a salt and returns the hashed
// representation.
func Hash(data string, salt string) string {
output := md5.Sum([]byte(data + salt))
return fmt.Sprintf("%x", output)
}

@@ -1,43 +0,0 @@
package middleware
import (
"net/http"
"github.com/prometheus/client_golang/prometheus"
"github.com/prometheus/client_golang/prometheus/promhttp"
)
var (
requestCounter = prometheus.NewCounterVec(
prometheus.CounterOpts{
Name: "handler_requests_total",
Help: "Total number of request/responses by HTTP status code.",
}, []string{"handler", "code"})
requestDuration = prometheus.NewHistogramVec(prometheus.HistogramOpts{
Name: "handler_request_duration",
Help: "Handler request duration.",
}, []string{"handler", "method"})
requestInFlight = prometheus.NewGaugeVec(prometheus.GaugeOpts{
Name: "handler_requests_in_flight",
Help: "Current number of requests being served.",
}, []string{"handler"})
)
func init() {
_ = prometheus.Register(requestCounter)
_ = prometheus.Register(requestDuration)
_ = prometheus.Register(requestInFlight)
}
// Metrics captures request duration, request count and in-flight request count
// metrics for HTTP handlers. The family field is used to discriminate handlers.
func Metrics(family string, next http.Handler) http.Handler {
return promhttp.InstrumentHandlerDuration(
requestDuration.MustCurryWith(prometheus.Labels{"handler": family}),
promhttp.InstrumentHandlerCounter(requestCounter.MustCurryWith(prometheus.Labels{"handler": family}),
promhttp.InstrumentHandlerInFlight(requestInFlight.With(prometheus.Labels{"handler": family}), next),
),
)
}

@@ -1,31 +0,0 @@
package middleware
import (
"net/http"
"github.com/celrenheit/sandflake"
"within.website/ln"
)
// RequestID appends a unique (sandflake) request ID to each request's
// X-Request-Id header field, much like Heroku's router does.
func RequestID(next http.Handler) http.Handler {
var g sandflake.Generator
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
id := g.Next().String()
if rid := r.Header.Get("X-Request-Id"); rid != "" {
id = rid + "," + id
}
ctx := ln.WithF(r.Context(), ln.F{
"request_id": id,
})
r = r.WithContext(ctx)
w.Header().Set("X-Request-Id", id)
r.Header.Set("X-Request-Id", id)
next.ServeHTTP(w, r)
})
}

@@ -1,296 +0,0 @@
package main
import (
"context"
"html/template"
"io/ioutil"
"net/http"
"os"
"sort"
"strings"
"time"
"christine.website/cmd/site/internal/blog"
"christine.website/cmd/site/internal/middleware"
"christine.website/jsonfeed"
"github.com/gorilla/feeds"
_ "github.com/joho/godotenv/autoload"
"github.com/povilasv/prommod"
"github.com/prometheus/client_golang/prometheus"
"github.com/prometheus/client_golang/prometheus/promhttp"
blackfriday "github.com/russross/blackfriday"
"github.com/sebest/xff"
"github.com/snabb/sitemap"
"within.website/ln"
"within.website/ln/ex"
"within.website/ln/opname"
)
var port = os.Getenv("PORT")
func main() {
if port == "" {
port = "29384"
}
ctx := ln.WithF(opname.With(context.Background(), "main"), ln.F{
"port": port,
"git_rev": gitRev,
})
_ = prometheus.Register(prommod.NewCollector("christine"))
s, err := Build()
if err != nil {
ln.FatalErr(ctx, err, ln.Action("Build"))
}
mux := http.NewServeMux()
mux.HandleFunc("/.within/health", func(w http.ResponseWriter, r *http.Request) {
http.Error(w, "OK", http.StatusOK)
})
mux.Handle("/", s)
ln.Log(ctx, ln.Action("http_listening"))
ln.FatalErr(ctx, http.ListenAndServe(":"+port, mux))
}
// Site is the parent object for https://christine.website's backend.
type Site struct {
Posts blog.Posts
Talks blog.Posts
Gallery blog.Posts
Resume template.HTML
Series []string
SignalBoost []Person
clacks ClackSet
patrons []string
rssFeed *feeds.Feed
jsonFeed *jsonfeed.Feed
mux *http.ServeMux
xffmw *xff.XFF
}
var gitRev = os.Getenv("GIT_REV")
func envOr(key, or string) string {
if result, ok := os.LookupEnv(key); ok {
return result
}
return or
}
func (s *Site) ServeHTTP(w http.ResponseWriter, r *http.Request) {
ctx := opname.With(r.Context(), "site.ServeHTTP")
ctx = ln.WithF(ctx, ln.F{
"user_agent": r.Header.Get("User-Agent"),
})
r = r.WithContext(ctx)
if gitRev != "" {
w.Header().Add("X-Git-Rev", gitRev)
}
w.Header().Add("X-Hacker", "If you are reading this, check out /signalboost to find people for your team")
s.clacks.Middleware(
middleware.RequestID(
s.xffmw.Handler(
ex.HTTPLog(s.mux),
),
),
).ServeHTTP(w, r)
}
var arbDate = time.Date(2020, time.May, 21, 0, 0, 0, 0, time.UTC)
// Build creates a new Site instance or fails.
func Build() (*Site, error) {
pc, err := NewPatreonClient()
if err != nil {
return nil, err
}
pledges, err := GetPledges(pc)
if err != nil {
return nil, err
}
people, err := loadPeople("./signalboost.dhall")
if err != nil {
return nil, err
}
smi := sitemap.New()
smi.Add(&sitemap.URL{
Loc: "https://christine.website/resume",
LastMod: &arbDate,
ChangeFreq: sitemap.Monthly,
})
smi.Add(&sitemap.URL{
Loc: "https://christine.website/contact",
LastMod: &arbDate,
ChangeFreq: sitemap.Monthly,
})
smi.Add(&sitemap.URL{
Loc: "https://christine.website/",
LastMod: &arbDate,
ChangeFreq: sitemap.Monthly,
})
smi.Add(&sitemap.URL{
Loc: "https://christine.website/patrons",
LastMod: &arbDate,
ChangeFreq: sitemap.Weekly,
})
smi.Add(&sitemap.URL{
Loc: "https://christine.website/blog",
LastMod: &arbDate,
ChangeFreq: sitemap.Weekly,
})
xffmw, err := xff.Default()
if err != nil {
return nil, err
}
s := &Site{
rssFeed: &feeds.Feed{
Title: "Christine Dodrill's Blog",
Link: &feeds.Link{Href: "https://christine.website/blog"},
Description: "My blog posts and rants about various technology things.",
Author: &feeds.Author{Name: "Christine Dodrill", Email: "me@christine.website"},
Created: bootTime,
Copyright: "This work is copyright Christine Dodrill. My viewpoints are my own and not the view of any employer past, current or future.",
},
jsonFeed: &jsonfeed.Feed{
Version: jsonfeed.CurrentVersion,
Title: "Christine Dodrill's Blog",
HomePageURL: "https://christine.website",
FeedURL: "https://christine.website/blog.json",
Description: "My blog posts and rants about various technology things.",
UserComment: "This is a JSON feed of my blogposts. For more information read: https://jsonfeed.org/version/1",
Icon: icon,
Favicon: icon,
Author: jsonfeed.Author{
Name: "Christine Dodrill",
Avatar: icon,
},
},
mux: http.NewServeMux(),
xffmw: xffmw,
clacks: ClackSet(strings.Split(envOr("CLACK_SET", "Ashlynn"), ",")),
patrons: pledges,
SignalBoost: people,
}
posts, err := blog.LoadPosts("./blog/", "blog")
if err != nil {
return nil, err
}
s.Posts = posts
s.Series = posts.Series()
sort.Strings(s.Series)
talks, err := blog.LoadPosts("./talks", "talks")
if err != nil {
return nil, err
}
s.Talks = talks
gallery, err := blog.LoadPosts("./gallery", "gallery")
if err != nil {
return nil, err
}
s.Gallery = gallery
var everything blog.Posts
everything = append(everything, posts...)
everything = append(everything, talks...)
everything = append(everything, gallery...)
sort.Sort(sort.Reverse(everything))
resumeData, err := ioutil.ReadFile("./static/resume/resume.md")
if err != nil {
return nil, err
}
s.Resume = template.HTML(blackfriday.Run(resumeData))
for _, item := range everything {
s.rssFeed.Items = append(s.rssFeed.Items, &feeds.Item{
Title: item.Title,
Link: &feeds.Link{Href: "https://christine.website/" + item.Link},
Description: item.Summary,
Created: item.Date,
Content: string(item.BodyHTML),
})
s.jsonFeed.Items = append(s.jsonFeed.Items, jsonfeed.Item{
ID: "https://christine.website/" + item.Link,
URL: "https://christine.website/" + item.Link,
Title: item.Title,
DatePublished: item.Date,
ContentHTML: string(item.BodyHTML),
Tags: item.Tags,
})
smi.Add(&sitemap.URL{
Loc: "https://christine.website/" + item.Link,
LastMod: &item.Date,
ChangeFreq: sitemap.Monthly,
})
}
// Add HTTP routes here
s.mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
if r.URL.Path != "/" {
w.WriteHeader(http.StatusNotFound)
s.renderTemplatePage("error.html", "can't find "+r.URL.Path).ServeHTTP(w, r)
return
}
s.renderTemplatePage("index.html", nil).ServeHTTP(w, r)
})
s.mux.Handle("/metrics", promhttp.Handler())
s.mux.Handle("/feeds", middleware.Metrics("feeds", s.renderTemplatePage("feeds.html", nil)))
s.mux.Handle("/patrons", middleware.Metrics("patrons", s.renderTemplatePage("patrons.html", s.patrons)))
s.mux.Handle("/signalboost", middleware.Metrics("signalboost", s.renderTemplatePage("signalboost.html", s.SignalBoost)))
s.mux.Handle("/resume", middleware.Metrics("resume", s.renderTemplatePage("resume.html", s.Resume)))
s.mux.Handle("/blog", middleware.Metrics("blog", s.renderTemplatePage("blogindex.html", s.Posts)))
s.mux.Handle("/talks", middleware.Metrics("talks", s.renderTemplatePage("talkindex.html", s.Talks)))
s.mux.Handle("/gallery", middleware.Metrics("gallery", s.renderTemplatePage("galleryindex.html", s.Gallery)))
s.mux.Handle("/contact", middleware.Metrics("contact", s.renderTemplatePage("contact.html", nil)))
s.mux.Handle("/blog.rss", middleware.Metrics("blog.rss", http.HandlerFunc(s.createFeed)))
s.mux.Handle("/blog.atom", middleware.Metrics("blog.atom", http.HandlerFunc(s.createAtom)))
s.mux.Handle("/blog.json", middleware.Metrics("blog.json", http.HandlerFunc(s.createJSONFeed)))
s.mux.Handle("/blog/", middleware.Metrics("blogpost", http.HandlerFunc(s.showPost)))
s.mux.Handle("/blog/series", http.HandlerFunc(s.listSeries))
s.mux.Handle("/blog/series/", http.HandlerFunc(s.showSeries))
s.mux.Handle("/talks/", middleware.Metrics("talks", http.HandlerFunc(s.showTalk)))
s.mux.Handle("/gallery/", middleware.Metrics("gallery", http.HandlerFunc(s.showGallery)))
s.mux.Handle("/css/", http.FileServer(http.Dir(".")))
s.mux.Handle("/static/", http.FileServer(http.Dir(".")))
s.mux.HandleFunc("/sw.js", func(w http.ResponseWriter, r *http.Request) {
http.ServeFile(w, r, "./static/js/sw.js")
})
s.mux.HandleFunc("/robots.txt", func(w http.ResponseWriter, r *http.Request) {
http.ServeFile(w, r, "./static/robots.txt")
})
s.mux.Handle("/sitemap.xml", middleware.Metrics("sitemap", http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/xml")
_, _ = smi.WriteTo(w)
})))
s.mux.HandleFunc("/api/pageview-timer", handlePageViewTimer)
return s, nil
}
const icon = "https://christine.website/static/img/avatar.png"

@@ -1,53 +0,0 @@
package main
import (
"encoding/json"
"io/ioutil"
"net/http"
"time"
"github.com/prometheus/client_golang/prometheus"
"within.website/ln"
)
var (
readTimes = prometheus.NewHistogramVec(prometheus.HistogramOpts{
Name: "blogpage_read_times",
Help: "This tracks how much time people spend reading articles on my blog",
}, []string{"path"})
)
func init() {
_ = prometheus.Register(readTimes)
}
func handlePageViewTimer(w http.ResponseWriter, r *http.Request) {
if r.Header.Get("DNT") == "1" {
http.NotFound(w, r)
return
}
data, err := ioutil.ReadAll(r.Body)
if err != nil {
ln.Error(r.Context(), err, ln.Info("while reading data"))
http.Error(w, "oopsie whoopsie uwu", http.StatusInternalServerError)
return
}
r.Body.Close()
type metricsData struct {
Path string `json:"path"`
StartTime time.Time `json:"start_time"`
EndTime time.Time `json:"end_time"`
}
var md metricsData
err = json.Unmarshal(data, &md)
if err != nil {
http.NotFound(w, r)
return
}
diff := md.EndTime.Sub(md.StartTime).Seconds()
readTimes.WithLabelValues(md.Path).Observe(float64(diff))
}

@@ -1,112 +0,0 @@
package main
import (
"context"
"fmt"
"net/http"
"os"
"sort"
"time"
"github.com/mxpv/patreon-go"
"golang.org/x/oauth2"
"within.website/ln"
)
func NewPatreonClient() (*patreon.Client, error) {
for _, name := range []string{"CLIENT_ID", "CLIENT_SECRET", "ACCESS_TOKEN", "REFRESH_TOKEN"} {
if os.Getenv("PATREON_"+name) == "" {
return nil, fmt.Errorf("wanted envvar PATREON_%s", name)
}
}
config := oauth2.Config{
ClientID: os.Getenv("PATREON_CLIENT_ID"),
ClientSecret: os.Getenv("PATREON_CLIENT_SECRET"),
Endpoint: oauth2.Endpoint{
AuthURL: patreon.AuthorizationURL,
TokenURL: patreon.AccessTokenURL,
},
Scopes: []string{"users", "campaigns", "pledges", "pledges-to-me", "my-campaign"},
}
token := oauth2.Token{
AccessToken: os.Getenv("PATREON_ACCESS_TOKEN"),
RefreshToken: os.Getenv("PATREON_REFRESH_TOKEN"),
// Must be non-nil, otherwise token will not be expired
Expiry: time.Now().Add(90 * 24 * time.Hour),
}
tc := config.Client(context.Background(), &token)
trans := tc.Transport
tc.Transport = lnLoggingTransport{next: trans}
client := patreon.NewClient(tc)
return client, nil
}
func GetPledges(pc *patreon.Client) ([]string, error) {
campaign, err := pc.FetchCampaign()
if err != nil {
return nil, fmt.Errorf("campaign fetch error: %w", err)
}
campaignID := campaign.Data[0].ID
cursor := ""
var result []string
for {
pledgesResponse, err := pc.FetchPledges(campaignID, patreon.WithPageSize(25), patreon.WithCursor(cursor))
if err != nil {
return nil, err
}
users := make(map[string]*patreon.User)
for _, item := range pledgesResponse.Included.Items {
u, ok := item.(*patreon.User)
if !ok {
continue
}
users[u.ID] = u
}
for _, pledge := range pledgesResponse.Data {
pid := pledge.Relationships.Patron.Data.ID
patronFullName := users[pid].Attributes.FullName
result = append(result, patronFullName)
}
cursor = pledgesResponse.Links.Next
if cursor == "" {
break
}
}
sort.Strings(result)
return result, nil
}
type lnLoggingTransport struct{ next http.RoundTripper }
func (l lnLoggingTransport) RoundTrip(r *http.Request) (*http.Response, error) {
ctx := r.Context()
f := ln.F{
"url": r.URL.String(),
"has_token": r.Header.Get("Authorization") != "",
}
resp, err := l.next.RoundTrip(r)
if err != nil {
return nil, err
}
f["status"] = resp.Status
ln.Log(ctx, f)
return resp, nil
}

@@ -1,91 +0,0 @@
package main
import (
"encoding/json"
"net/http"
"time"
"christine.website/cmd/site/internal"
"within.website/ln"
"within.website/ln/opname"
)
var bootTime = time.Now()
var etag = internal.Hash(bootTime.String(), IncrediblySecureSalt)
// IncrediblySecureSalt *******
const IncrediblySecureSalt = "hunter2"
func (s *Site) createFeed(w http.ResponseWriter, r *http.Request) {
ctx := opname.With(r.Context(), "rss-feed")
fetag := "W/" + internal.Hash(bootTime.String(), IncrediblySecureSalt)
w.Header().Set("ETag", fetag)
if r.Header.Get("If-None-Match") == fetag {
http.Error(w, "Cached data OK", http.StatusNotModified)
ln.Log(ctx, ln.Info("cache hit"))
return
}
w.Header().Set("Content-Type", "application/rss+xml")
err := s.rssFeed.WriteRss(w)
if err != nil {
http.Error(w, "Internal server error", http.StatusInternalServerError)
ln.Error(r.Context(), err, ln.F{
"remote_addr": r.RemoteAddr,
"action": "generating_rss",
"uri": r.RequestURI,
"host": r.Host,
})
}
}
func (s *Site) createAtom(w http.ResponseWriter, r *http.Request) {
ctx := opname.With(r.Context(), "atom-feed")
fetag := "W/" + internal.Hash(bootTime.String(), IncrediblySecureSalt)
w.Header().Set("ETag", fetag)
if r.Header.Get("If-None-Match") == fetag {
http.Error(w, "Cached data OK", http.StatusNotModified)
ln.Log(ctx, ln.Info("cache hit"))
return
}
w.Header().Set("Content-Type", "application/atom+xml")
err := s.rssFeed.WriteAtom(w)
if err != nil {
http.Error(w, "Internal server error", http.StatusInternalServerError)
ln.Error(ctx, err, ln.F{
"remote_addr": r.RemoteAddr,
"action": "generating_atom",
"uri": r.RequestURI,
"host": r.Host,
})
}
}
func (s *Site) createJSONFeed(w http.ResponseWriter, r *http.Request) {
ctx := opname.With(r.Context(), "atom-feed")
fetag := "W/" + internal.Hash(bootTime.String(), IncrediblySecureSalt)
w.Header().Set("ETag", fetag)
if r.Header.Get("If-None-Match") == fetag {
http.Error(w, "Cached data OK", http.StatusNotModified)
ln.Log(ctx, ln.Info("cache hit"))
return
}
w.Header().Set("Content-Type", "application/json")
e := json.NewEncoder(w)
e.SetIndent("", "\t")
err := e.Encode(s.jsonFeed)
if err != nil {
http.Error(w, "Internal server error", http.StatusInternalServerError)
ln.Error(ctx, err, ln.F{
"remote_addr": r.RemoteAddr,
"action": "generating_jsonfeed",
"uri": r.RequestURI,
"host": r.Host,
})
}
}

@@ -1,29 +0,0 @@
package main
import (
"io/ioutil"
"github.com/philandstuff/dhall-golang"
)
type Person struct {
Name string `dhall:"name"`
GitLink string `dhall:"gitLink"`
Twitter string `dhall:"twitter"`
Tags []string `dhall:"tags"`
}
func loadPeople(path string) ([]Person, error) {
data, err := ioutil.ReadFile(path)
if err != nil {
return nil, err
}
var people []Person
err = dhall.Unmarshal(data, &people)
if err != nil {
return nil, err
}
return people, nil
}

@@ -1,28 +0,0 @@
package main
import "testing"
func TestLoadPeople(t *testing.T) {
people, err := loadPeople("../../signalboost.dhall")
if err != nil {
t.Fatal(err)
}
for _, person := range people {
t.Run(person.Name, func(t *testing.T) {
if person.Name == "" {
t.Error("missing name")
}
if len(person.Tags) == 0 {
t.Error("missing tags")
}
if person.Twitter == "" {
t.Error("missing twitter")
}
if person.GitLink == "" {
t.Error("missing git link")
}
})
}
}

config.dhall
@@ -0,0 +1,27 @@
let Person =
{ Type = { name : Text, tags : List Text, gitLink : Text, twitter : Text }
, default =
{ name = "", tags = [] : List Text, gitLink = "", twitter = "" }
}
let defaultPort = env:PORT ? 3030
let Config =
{ Type =
{ signalboost : List Person.Type
, port : Natural
, clackSet : List Text
, resumeFname : Text
}
, default =
{ signalboost = [] : List Person.Type
, port = defaultPort
, clackSet = [ "Ashlynn" ]
, resumeFname = "./static/resume/resume.md"
}
}
in Config::{
, signalboost = ./signalboost.dhall
, clackSet = [ "Ashlynn", "Terry Davis", "Dennis Ritchie" ]
}

css/shim.css
@@ -0,0 +1,20 @@
.main {
padding: 20px 10px;
}
.hack h1 {
padding-top: 0;
}
footer.footer {
border-top: 1px solid #ccc;
margin-top: 80px;
margin-top: 5rem;
padding: 48px 0;
padding: 3rem 0;
}
img {
max-width: 100%;
padding: 1em;
}

@@ -1,6 +1,23 @@
{ }:
{ system ? builtins.currentSystem }:
let
sources = import ./nix/sources.nix;
pkgs = import sources.nixpkgs { };
in pkgs.callPackage ./site.nix { inherit pkgs; }
pkgs = import sources.nixpkgs { inherit system; };
callPackage = pkgs.lib.callPackageWith pkgs;
site = callPackage ./site.nix { };
dockerImage = pkg:
pkgs.dockerTools.buildLayeredImage {
name = "xena/christinewebsite";
tag = "latest";
contents = [ pkgs.cacert pkg ];
config = {
Cmd = [ "${pkg}/bin/xesite" ];
Env = [ "CONFIG_FNAME=${pkg}/config.dhall" "RUST_LOG=info" ];
WorkingDir = "/";
};
};
in dockerImage site

@@ -1,21 +0,0 @@
{ system ? builtins.currentSystem }:
let
pkgs = import (import ./nix/sources.nix).nixpkgs { inherit system; };
callPackage = pkgs.lib.callPackageWith pkgs;
site = callPackage ./site.nix { };
dockerImage = pkg:
pkgs.dockerTools.buildLayeredImage {
name = "xena/christinewebsite";
tag = pkg.version;
contents = [ pkg pkgs.cacert ];
config = {
Cmd = [ "/bin/site" ];
WorkingDir = "/";
};
};
in dockerImage site

go.mod
@@ -1,21 +0,0 @@
module christine.website
require (
github.com/celrenheit/sandflake v0.0.0-20190410195419-50a943690bc2
github.com/gorilla/feeds v1.1.1
github.com/joho/godotenv v1.3.0
github.com/mxpv/patreon-go v0.0.0-20190917022727-646111f1d983
github.com/philandstuff/dhall-golang v1.0.0
github.com/povilasv/prommod v0.0.12
github.com/prometheus/client_golang v1.7.1
github.com/russross/blackfriday v2.0.0+incompatible
github.com/sebest/xff v0.0.0-20160910043805-6c115e0ffa35
github.com/shurcooL/sanitized_anchor_name v1.0.0 // indirect
github.com/snabb/sitemap v1.0.0
github.com/stretchr/testify v1.6.1
golang.org/x/oauth2 v0.0.0-20200107190931-bf48bf16ab8d
gopkg.in/yaml.v2 v2.3.0
within.website/ln v0.9.1
)
go 1.13

go.sum
@@ -1,211 +0,0 @@
cloud.google.com/go v0.34.0/go.mod h1:aQUYkXzVsufM+DwF1aE+0xfcU+56JwCaLick0ClmMTw=
github.com/alecthomas/template v0.0.0-20160405071501-a0175ee3bccc/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc=
github.com/alecthomas/template v0.0.0-20190718012654-fb15b899a751/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc=
github.com/alecthomas/units v0.0.0-20151022065526-2efee857e7cf/go.mod h1:ybxpYRFXyAe+OPACYpWeL0wqObRcbAqCMya13uyzqw0=
github.com/alecthomas/units v0.0.0-20190717042225-c3de453c63f4/go.mod h1:ybxpYRFXyAe+OPACYpWeL0wqObRcbAqCMya13uyzqw0=
github.com/beorn7/perks v0.0.0-20180321164747-3a771d992973/go.mod h1:Dwedo/Wpr24TaqPxmxbtue+5NUziq4I4S80YR8gNf3Q=
github.com/beorn7/perks v1.0.0 h1:HWo1m869IqiPhD389kmkxeTalrjNbbJTC8LXupb+sl0=
github.com/beorn7/perks v1.0.0/go.mod h1:KWe93zE9D1o94FZ5RNwFwVgaQK1VOXiVxmqh+CedLV8=
github.com/beorn7/perks v1.0.1 h1:VlbKKnNfV8bJzeqoa4cOKqO6bYr3WgKZxO8Z16+hsOM=
github.com/beorn7/perks v1.0.1/go.mod h1:G2ZrVWU2WbWT9wwq4/hrbKbnv/1ERSJQ0ibhJ6rlkpw=
github.com/celrenheit/sandflake v0.0.0-20190410195419-50a943690bc2 h1:/BpnZPo/sk1vPlt62dLya5KCn7PN9ZBDrpTGlQzgUZI=
github.com/celrenheit/sandflake v0.0.0-20190410195419-50a943690bc2/go.mod h1:7L8gY0+4GYeBc9TvqVuDUq7tXuM6Sj7llnt7HkVwWlQ=
github.com/cespare/xxhash/v2 v2.1.1 h1:6MnRN8NT7+YBpUIWxHtefFZOKTAPgGjpQSxqLNn0+qY=
github.com/cespare/xxhash/v2 v2.1.1/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/fsnotify/fsnotify v1.4.7/go.mod h1:jwhsz4b93w/PPRr/qN1Yymfu8t87LnFCMoQvtojpjFo=
github.com/go-kit/kit v0.8.0/go.mod h1:xBxKIO96dXMWWy0MnWVtmwkA9/13aqxPnvrjFYMA2as=
github.com/go-kit/kit v0.9.0/go.mod h1:xBxKIO96dXMWWy0MnWVtmwkA9/13aqxPnvrjFYMA2as=
github.com/go-logfmt/logfmt v0.3.0/go.mod h1:Qt1PoO58o5twSAckw1HlFXLmHsOX5/0LbT9GBnD5lWE=
github.com/go-logfmt/logfmt v0.4.0/go.mod h1:3RMwSq7FuexP4Kalkev3ejPJsZTpXXBr9+V4qmtdjCk=
github.com/go-stack/stack v1.8.0/go.mod h1:v0f6uXyyMGvRgIKkXu+yp6POWl0qKG85gN/melR3HDY=
github.com/gogo/protobuf v1.1.1/go.mod h1:r8qH/GZQm5c6nD/R0oafs1akxWv10x8SbQlK7atdtwQ=
github.com/golang/protobuf v1.2.0/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
github.com/golang/protobuf v1.3.1 h1:YF8+flBXS5eO826T4nzqPrxfhQThhXl0YzfuUPu4SBg=
github.com/golang/protobuf v1.3.1/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
github.com/golang/protobuf v1.3.2 h1:6nsPYzhq5kReh6QImI3k5qWzO4PEbvbIW2cwSfR/6xs=
github.com/golang/protobuf v1.3.2/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
github.com/golang/protobuf v1.4.0-rc.1/go.mod h1:ceaxUfeHdC40wWswd/P6IGgMaK3YpKi5j83Wpe3EHw8=
github.com/golang/protobuf v1.4.0-rc.1.0.20200221234624-67d41d38c208/go.mod h1:xKAWHe0F5eneWXFV3EuXVDTCmh+JuBKY0li0aMyXATA=
github.com/golang/protobuf v1.4.0-rc.2/go.mod h1:LlEzMj4AhA7rCAGe4KMBDvJI+AwstrUpVNzEA03Pprs=
github.com/golang/protobuf v1.4.0-rc.4.0.20200313231945-b860323f09d0/go.mod h1:WU3c8KckQ9AFe+yFwt9sWVRKCVIyN9cPHBJSNnbL67w=
github.com/golang/protobuf v1.4.0 h1:oOuy+ugB+P/kBdUnG5QaMXSIyJ1q38wWSojYCb3z5VQ=
github.com/golang/protobuf v1.4.0/go.mod h1:jodUvKwWbYaEsadDk5Fwe5c77LiNKVO9IDvqG2KuDX0=
github.com/golang/protobuf v1.4.2 h1:+Z5KGCizgyZCbGh1KZqA0fcLLkwbsjIzS4aV2v7wJX0=
github.com/golang/protobuf v1.4.2/go.mod h1:oDoupMAO8OvCJWAcko0GGGIgR6R6ocIYbsSw735rRwI=
github.com/google/go-cmp v0.3.0/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
github.com/google/go-cmp v0.3.1 h1:Xye71clBPdm5HgqGwUkwhbynsUJZhDbS20FvLhQ2izg=
github.com/google/go-cmp v0.3.1/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
github.com/google/go-cmp v0.4.0 h1:xsAVV57WRhGj6kEIi8ReJzQlHHqcBYCElAvkovg3B/4=
github.com/google/go-cmp v0.4.0/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/gofuzz v1.0.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg=
github.com/gorilla/feeds v1.1.1 h1:HwKXxqzcRNg9to+BbvJog4+f3s/xzvtZXICcQGutYfY=
github.com/gorilla/feeds v1.1.1/go.mod h1:Nk0jZrvPFZX1OBe5NPiddPw7CfwF6Q9eqzaBbaightA=
github.com/hpcloud/tail v1.0.0/go.mod h1:ab1qPbhIpdTxEkNHXyeSf5vhxWSCs/tWer42PpOxQnU=
github.com/joho/godotenv v1.3.0 h1:Zjp+RcGpHhGlrMbJzXTrZZPrWj+1vfm90La1wgB6Bhc=
github.com/joho/godotenv v1.3.0/go.mod h1:7hK45KPybAkOC6peb+G5yklZfMxEjkZhHbwpqxOKXbg=
github.com/json-iterator/go v1.1.6/go.mod h1:+SdeFBvtyEkXs7REEP0seUULqWtbJapLOCVDaaPEHmU=
github.com/json-iterator/go v1.1.9/go.mod h1:KdQUCv79m/52Kvf8AW2vK1V8akMuk1QjK/uOdHXbAo4=
github.com/json-iterator/go v1.1.10/go.mod h1:KdQUCv79m/52Kvf8AW2vK1V8akMuk1QjK/uOdHXbAo4=
github.com/julienschmidt/httprouter v1.2.0/go.mod h1:SYymIcj16QtmaHHD7aYtjjsJG7VTCxuUUipMqKk8s4w=
github.com/konsorten/go-windows-terminal-sequences v1.0.1/go.mod h1:T0+1ngSBFLxvqU3pZ+m/2kptfBszLMUkC4ZK/EgS/cQ=
github.com/kr/logfmt v0.0.0-20140226030751-b84e30acd515/go.mod h1:+0opPa2QZZtGFBFZlji/RkVcI2GknAs/DXo4wKdlNEc=
github.com/kr/pretty v0.1.0 h1:L/CwN0zerZDmRFUapSPitk6f+Q3+0za1rQkzVuMiMFI=
github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
github.com/kr/text v0.1.0 h1:45sCR5RtlFHMR4UwH9sdQ5TC8v0qDQCHnXt+kaKSTVE=
github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=
github.com/leanovate/gopter v0.2.5-0.20190402064358-634a59d12406/go.mod h1:gNcbPWNEWRe4lm+bycKqxUYoH5uoVje5SkOJ3uoLer8=
github.com/matttproud/golang_protobuf_extensions v1.0.1 h1:4hp9jkHxhMHkqkrB3Ix0jegS5sx/RkqARlsWZ6pIwiU=
github.com/matttproud/golang_protobuf_extensions v1.0.1/go.mod h1:D8He9yQNgCq6Z5Ld7szi9bcBfOoFv/3dc6xSMkL2PC0=
github.com/modern-go/concurrent v0.0.0-20180228061459-e0a39a4cb421/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
github.com/modern-go/reflect2 v0.0.0-20180701023420-4b7aa43c6742/go.mod h1:bx2lNnkwVCuqBIxFjflWJWanXIb3RllmbCylyMrvgv0=
github.com/modern-go/reflect2 v1.0.1/go.mod h1:bx2lNnkwVCuqBIxFjflWJWanXIb3RllmbCylyMrvgv0=
github.com/mwitkow/go-conntrack v0.0.0-20161129095857-cc309e4a2223/go.mod h1:qRWi+5nqEBWmkhHvq77mSJWrCKwh8bxhgT7d/eI7P4U=
github.com/mxpv/patreon-go v0.0.0-20190917022727-646111f1d983 h1:r32TFg+FHLnoF8PCqCQNp+R9EjMBuP62FXkD/Eqp9Us=
github.com/mxpv/patreon-go v0.0.0-20190917022727-646111f1d983/go.mod h1:ksYjm2GAbGlgIP7jO9Q5/AdyE4MwwEbgQ+lFMx3hyiM=
github.com/onsi/ginkgo v1.6.0/go.mod h1:lLunBs/Ym6LB5Z9jYTR76FiuTmxDTDusOGeTQH+WWjE=
github.com/onsi/ginkgo v1.7.0/go.mod h1:lLunBs/Ym6LB5Z9jYTR76FiuTmxDTDusOGeTQH+WWjE=
github.com/onsi/gomega v1.4.3/go.mod h1:ex+gbHU/CVuBBDIJjb2X0qEXbFg53c61hWP/1CpauHY=
github.com/philandstuff/dhall-golang v1.0.0 h1:4iYE+OfVjpXtwB6todsw5w+rnBvAhufgpNzAo9K0ljw=
github.com/philandstuff/dhall-golang v1.0.0/go.mod h1:nYfzcKjqq6UDCStpXV6UxRwD0HX9IK9z/MuHmHghbEY=
github.com/pkg/errors v0.8.0/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pkg/errors v0.8.1 h1:iURUrRGxPUNPdy5/HRSm+Yj6okJ6UtLINN0Q9M4+h3I=
github.com/pkg/errors v0.8.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/povilasv/prommod v0.0.12 h1:0bk9QJ7kD6SmSsk9MeHhz5Qe6OpQl11Fvo7cvvmNUQM=
github.com/povilasv/prommod v0.0.12/go.mod h1:GnuK7wLoVBwZXj8bhbJNx/xFSldy7Q49A44RJKNM8XQ=
github.com/prometheus/client_golang v0.9.1/go.mod h1:7SWBe2y4D6OKWSNQJUaRYU/AaXPKyh/dDVn+NZz0KFw=
github.com/prometheus/client_golang v1.0.0 h1:vrDKnkGzuGvhNAL56c7DBz29ZL+KxnoR0x7enabFceM=
github.com/prometheus/client_golang v1.0.0/go.mod h1:db9x61etRT2tGnBNRi70OPL5FsnadC4Ky3P0J6CfImo=
github.com/prometheus/client_golang v1.4.1 h1:FFSuS004yOQEtDdTq+TAOLP5xUq63KqAFYyOi8zA+Y8=
github.com/prometheus/client_golang v1.4.1/go.mod h1:e9GMxYsXl05ICDXkRhurwBS4Q3OK1iX/F2sw+iXX5zU=
github.com/prometheus/client_golang v1.5.0 h1:Ctq0iGpCmr3jeP77kbF2UxgvRwzWWz+4Bh9/vJTyg1A=
github.com/prometheus/client_golang v1.5.0/go.mod h1:e9GMxYsXl05ICDXkRhurwBS4Q3OK1iX/F2sw+iXX5zU=
github.com/prometheus/client_golang v1.5.1 h1:bdHYieyGlH+6OLEk2YQha8THib30KP0/yD0YH9m6xcA=
github.com/prometheus/client_golang v1.5.1/go.mod h1:e9GMxYsXl05ICDXkRhurwBS4Q3OK1iX/F2sw+iXX5zU=
github.com/prometheus/client_golang v1.6.0 h1:YVPodQOcK15POxhgARIvnDRVpLcuK8mglnMrWfyrw6A=
github.com/prometheus/client_golang v1.6.0/go.mod h1:ZLOG9ck3JLRdB5MgO8f+lLTe83AXG6ro35rLTxvnIl4=
github.com/prometheus/client_golang v1.7.0 h1:wCi7urQOGBsYcQROHqpUUX4ct84xp40t9R9JX0FuA/U=
github.com/prometheus/client_golang v1.7.0/go.mod h1:PY5Wy2awLA44sXw4AOSfFBetzPP4j5+D6mVACh+pe2M=
github.com/prometheus/client_golang v1.7.1 h1:NTGy1Ja9pByO+xAeH/qiWnLrKtr3hJPNjaVUwnjpdpA=
github.com/prometheus/client_golang v1.7.1/go.mod h1:PY5Wy2awLA44sXw4AOSfFBetzPP4j5+D6mVACh+pe2M=
github.com/prometheus/client_model v0.0.0-20180712105110-5c3871d89910/go.mod h1:MbSGuTsp3dbXC40dX6PRTWyKYBIrTGTE9sqQNg2J8bo=
github.com/prometheus/client_model v0.0.0-20190129233127-fd36f4220a90 h1:S/YWwWx/RA8rT8tKFRuGUZhuA90OyIBpPCXkcbwU8DE=
github.com/prometheus/client_model v0.0.0-20190129233127-fd36f4220a90/go.mod h1:xMI15A0UPsDsEKsMN9yxemIoYk6Tm2C1GtYGdfGttqA=
github.com/prometheus/client_model v0.2.0 h1:uq5h0d+GuxiXLJLNABMgp2qUWDPiLvgCzz2dUR+/W/M=
github.com/prometheus/client_model v0.2.0/go.mod h1:xMI15A0UPsDsEKsMN9yxemIoYk6Tm2C1GtYGdfGttqA=
github.com/prometheus/common v0.4.1 h1:K0MGApIoQvMw27RTdJkPbr3JZ7DNbtxQNyi5STVM6Kw=
github.com/prometheus/common v0.4.1/go.mod h1:TNfzLD0ON7rHzMJeJkieUDPYmFC7Snx/y86RQel1bk4=
github.com/prometheus/common v0.9.1 h1:KOMtN28tlbam3/7ZKEYKHhKoJZYYj3gMH4uc62x7X7U=
github.com/prometheus/common v0.9.1/go.mod h1:yhUN8i9wzaXS3w1O07YhxHEBxD+W35wd8bs7vj7HSQ4=
github.com/prometheus/common v0.10.0 h1:RyRA7RzGXQZiW+tGMr7sxa85G1z0yOpM1qq5c8lNawc=
github.com/prometheus/common v0.10.0/go.mod h1:Tlit/dnDKsSWFlCLTWaA1cyBgKHSMdTB80sz/V91rCo=
github.com/prometheus/procfs v0.0.0-20181005140218-185b4288413d/go.mod h1:c3At6R/oaqEKCNdg8wHV1ftS6bRYblBhIjjI8uT2IGk=
github.com/prometheus/procfs v0.0.2 h1:6LJUbpNm42llc4HRCuvApCSWB/WfhuNo9K98Q9sNGfs=
github.com/prometheus/procfs v0.0.2/go.mod h1:TjEm7ze935MbeOT/UhFTIMYKhuLP4wbCsTZCD3I8kEA=
github.com/prometheus/procfs v0.0.8 h1:+fpWZdT24pJBiqJdAwYBjPSk+5YmQzYNPYzQsdzLkt8=
github.com/prometheus/procfs v0.0.8/go.mod h1:7Qr8sr6344vo1JqZ6HhLceV9o3AJ1Ff+GxbHq6oeK9A=
github.com/prometheus/procfs v0.0.11 h1:DhHlBtkHWPYi8O2y31JkK0TF+DGM+51OopZjH/Ia5qI=
github.com/prometheus/procfs v0.0.11/go.mod h1:lV6e/gmhEcM9IjHGsFOCxxuZ+z1YqCvr4OA4YeYWdaU=
github.com/prometheus/procfs v0.1.3 h1:F0+tqvhOksq22sc6iCHF5WGlWjdwj92p0udFh1VFBS8=
github.com/prometheus/procfs v0.1.3/go.mod h1:lV6e/gmhEcM9IjHGsFOCxxuZ+z1YqCvr4OA4YeYWdaU=
github.com/russross/blackfriday v2.0.0+incompatible h1:cBXrhZNUf9C+La9/YpS+UHpUT8YD6Td9ZMSU9APFcsk=
github.com/russross/blackfriday v2.0.0+incompatible/go.mod h1:JO/DiYxRf+HjHt06OyowR9PTA263kcR/rfWxYHBV53g=
github.com/sebest/xff v0.0.0-20160910043805-6c115e0ffa35 h1:eajwn6K3weW5cd1ZXLu2sJ4pvwlBiCWY4uDejOr73gM=
github.com/sebest/xff v0.0.0-20160910043805-6c115e0ffa35/go.mod h1:wozgYq9WEBQBaIJe4YZ0qTSFAMxmcwBhQH0fO0R34Z0=
github.com/shurcooL/sanitized_anchor_name v1.0.0 h1:PdmoCO6wvbs+7yrJyMORt4/BmY5IYyJwS/kOiWx8mHo=
github.com/shurcooL/sanitized_anchor_name v1.0.0/go.mod h1:1NzhyTcUVG4SuEtjjoZeVRXNmyL/1OwPU0+IJeTBvfc=
github.com/sirupsen/logrus v1.2.0/go.mod h1:LxeOpSwHxABJmUn/MG1IvRgCAasNZTLOkJPxbbu5VWo=
github.com/sirupsen/logrus v1.4.2/go.mod h1:tLMulIdttU9McNUspp0xgXVQah82FyeX6MwdIuYE2rE=
github.com/snabb/diagio v1.0.0 h1:kovhQ1rDXoEbmpf/T5N2sUp2iOdxEg+TcqzbYVHV2V0=
github.com/snabb/diagio v1.0.0/go.mod h1:ZyGaWFhfBVqstGUw6laYetzeTwZ2xxVPqTALx1QQa1w=
github.com/snabb/sitemap v1.0.0 h1:7vJeNPAaaj7fQSRS3WYuJHzUjdnhLdSLLpvVtnhbzC0=
github.com/snabb/sitemap v1.0.0/go.mod h1:Id8uz1+WYdiNmSjEi4BIvL5UwNPYLsTHzRbjmDwNDzA=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/objx v0.1.1 h1:2vfRuCMp5sSVIDSqO8oNnWJq7mPa6KVP3iPIwFBuy8A=
github.com/stretchr/objx v0.1.1/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
github.com/stretchr/testify v1.3.0 h1:TivCn/peBQ7UY8ooIcPgZFpTNSz0Q2U6UrFlUfqbe0Q=
github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
github.com/stretchr/testify v1.4.0 h1:2E4SXV/wtOkTonXsotYi4li6zVWxYlZuYNCXe9XRJyk=
github.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81PSLYec5m4=
github.com/stretchr/testify v1.5.1 h1:nOGnQDM7FYENwehXlg/kFVnos3rEvtKTjRvOWSzb6H4=
github.com/stretchr/testify v1.5.1/go.mod h1:5W2xD1RspED5o8YsWQXVCued0rvSQ+mT+I5cxcmMvtA=
github.com/stretchr/testify v1.6.0 h1:jlIyCplCJFULU/01vCkhKuTyc3OorI3bJFuw6obfgho=
github.com/stretchr/testify v1.6.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.6.1 h1:hDPOHmpOpP40lSULcqw7IrRb/u7w6RpDC9399XyoNd0=
github.com/stretchr/testify v1.6.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/ugorji/go v1.1.5-0.20190603013658-a2c9fa250719 h1:UW5IeyWBDAPQ+Qu1hT/lwtxL7pP3L+ETA8WuBvvvBWU=
github.com/ugorji/go v1.1.5-0.20190603013658-a2c9fa250719/go.mod h1:RaaajvHwnCbhlqWLTIB78hyPWp24YUXhQ3YXM7Hg7os=
golang.org/x/crypto v0.0.0-20180904163835-0709b304e793/go.mod h1:6SG95UA2DQfeDnfUPMdvaQW0Q7yPrPDi9nlGo2tz2b4=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/net v0.0.0-20180724234803-3673e40ba225/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20180811021610-c39426892332/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20180906233101-161cd47e91fd/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20181114220301-adae6a3d119a/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20190108225652-1e06a53dbb7e/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20190404232315-eb5bcb51f2a3/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
golang.org/x/net v0.0.0-20190613194153-d28f0bde5980 h1:dfGZHvZk057jK2MCeWus/TowKpJ8y4AmooUzdBSR9GU=
golang.org/x/net v0.0.0-20190613194153-d28f0bde5980/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=
golang.org/x/oauth2 v0.0.0-20200107190931-bf48bf16ab8d h1:TzXSXBo42m9gQenoE3b9BGiEpg5IG2JkU5FkPIawgtw=
golang.org/x/oauth2 v0.0.0-20200107190931-bf48bf16ab8d/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
golang.org/x/sync v0.0.0-20180314180146-1d60e4601c6f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20181108010431-42b317875d0f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20181221193216-37e7f081c4d4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20190911185100-cd5d95a43a6e/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sys v0.0.0-20180905080454-ebe1bf3edb33/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20180909124046-d0be0721c37e/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20181116152217-5ac8a444bdc5/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a h1:1BGLXjeY4akVXGgbC9HugT3Jv3hCI0z56oJR5vAMgBU=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190412213103-97732733099d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20190422165155-953cdadca894/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200106162015-b016eb3dc98e/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200122134326-e047566fdf82 h1:ywK/j/KkyTHcdyYSZNXGjMwgmDSfjglYZ3vStQ/gSCU=
golang.org/x/sys v0.0.0-20200122134326-e047566fdf82/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200420163511-1957bb5e6d1f h1:gWF768j/LaZugp8dyS4UwsslYCYz9XgFxvlgsn0n9H8=
golang.org/x/sys v0.0.0-20200420163511-1957bb5e6d1f/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200615200032-f1bc736245b1 h1:ogLJMz+qpzav7lGMh10LMvAkM/fAoGlaiiHYiFYdm80=
golang.org/x/sys v0.0.0-20200615200032-f1bc736245b1/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543 h1:E7g+9GITq07hpfrRu66IVDexMakfv52eLZ2CXBWiKr4=
golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
google.golang.org/appengine v1.4.0 h1:/wp5JvzpHIxhs/dumFmF7BXTf3Z+dd4uXta4kVyO508=
google.golang.org/appengine v1.4.0/go.mod h1:xpcJRLb0r/rnEns0DIKYYv+WjYCduHsrkT7/EB5XEv4=
google.golang.org/protobuf v0.0.0-20200109180630-ec00e32a8dfd/go.mod h1:DFci5gLYBciE7Vtevhsrf46CRTquxDuWsQurQQe4oz8=
google.golang.org/protobuf v0.0.0-20200221191635-4d8936d0db64/go.mod h1:kwYJMbMJ01Woi6D6+Kah6886xMZcty6N08ah7+eCXa0=
google.golang.org/protobuf v0.0.0-20200228230310-ab0ca4ff8a60/go.mod h1:cfTl7dwQJ+fmap5saPgwCLgHXTUD7jkjRqWcaiX5VyM=
google.golang.org/protobuf v1.20.1-0.20200309200217-e05f789c0967/go.mod h1:A+miEFZTKqfCUM6K7xSMQL9OKL/b6hQv+e19PK+JZNE=
google.golang.org/protobuf v1.21.0 h1:qdOKuR/EIArgaWNjetjgTzgVTAZ+S/WXVrq9HW9zimw=
google.golang.org/protobuf v1.21.0/go.mod h1:47Nbq4nVaFHyn7ilMalzfO3qCViNmqZ2kzikPIcrTAo=
google.golang.org/protobuf v1.23.0 h1:4MY060fB1DLGMB/7MBTLnwQUY6+F09GEiz6SsrNqyzM=
google.golang.org/protobuf v1.23.0/go.mod h1:EGpADcykh3NcUnDUJcl1+ZksZNG86OlYog2l/sGQquU=
gopkg.in/alecthomas/kingpin.v2 v2.2.6/go.mod h1:FMv+mEhP44yOT+4EoQTLFTRgOQ1FBLkstjWtayDeSgw=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15 h1:YR8cESwS4TdDjEe65xsg0ogRM/Nc3DYOhEAlW+xobZo=
gopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/fsnotify.v1 v1.4.7/go.mod h1:Tz8NjZHkW78fSQdbUxIjBTcgA1z1m8ZHf0WmKUhAMys=
gopkg.in/tomb.v1 v1.0.0-20141024135613-dd632973f1e7/go.mod h1:dt/ZhP58zS4L8KSrWDmTeBkI65Dw0HsyUHuEVlX15mw=
gopkg.in/yaml.v2 v2.2.1/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.2.2 h1:ZCJp+EgiOT7lHqUV2J862kp8Qj64Jo6az82+3Td9dZw=
gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.2.4/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.2.5/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.2.8 h1:obN1ZagJSUGI0Ek/LBmuj4SNLPfIny3KsKFopxRdj10=
gopkg.in/yaml.v2 v2.2.8/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.3.0 h1:clyUAQHOM3G0M3f5vQj7LuJrETvjVot3Z5el9nffUtU=
gopkg.in/yaml.v2 v2.3.0/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c h1:dUUwHk2QECo/6vqA44rthZ8ie2QXMNeKRTHCNY2nXvo=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
within.website/ln v0.8.0 h1:NX6Eo3LkM9RU8lLRbWpmR5/jQRYKtZ8zuiYi7mmKa6w=
within.website/ln v0.8.0/go.mod h1:I+Apk8qxMStNXTZdyDMqDqe6CB8Hn6+W/Gyf5QbY+2E=
within.website/ln v0.9.0 h1:165zpOgw5Rq278x+u2j3o4662BW/pjavL0vsAzyumxk=
within.website/ln v0.9.0/go.mod h1:I+Apk8qxMStNXTZdyDMqDqe6CB8Hn6+W/Gyf5QbY+2E=
within.website/ln v0.9.1 h1:Qi8IjeCnU43jXijKtr5qtcbjuiCVAudOIxqTim7svnc=
within.website/ln v0.9.1/go.mod h1:I+Apk8qxMStNXTZdyDMqDqe6CB8Hn6+W/Gyf5QbY+2E=


@ -1,17 +0,0 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/
language: go
go:
- 1.8.1
before_install:
- go get -t -v ./...
script:
- go test -race -coverprofile=coverage.txt -covermode=atomic
after_success:
- bash <(curl -s https://codecov.io/bash)


@ -1,363 +0,0 @@
Mozilla Public License, version 2.0
1. Definitions
1.1. "Contributor"
means each individual or legal entity that creates, contributes to the
creation of, or owns Covered Software.
1.2. "Contributor Version"
means the combination of the Contributions of others (if any) used by a
Contributor and that particular Contributor's Contribution.
1.3. "Contribution"
means Covered Software of a particular Contributor.
1.4. "Covered Software"
means Source Code Form to which the initial Contributor has attached the
notice in Exhibit A, the Executable Form of such Source Code Form, and
Modifications of such Source Code Form, in each case including portions
thereof.
1.5. "Incompatible With Secondary Licenses"
means
a. that the initial Contributor has attached the notice described in
Exhibit B to the Covered Software; or
b. that the Covered Software was made available under the terms of
version 1.1 or earlier of the License, but not also under the terms of
a Secondary License.
1.6. "Executable Form"
means any form of the work other than Source Code Form.
1.7. "Larger Work"
means a work that combines Covered Software with other material, in a
separate file or files, that is not Covered Software.
1.8. "License"
means this document.
1.9. "Licensable"
means having the right to grant, to the maximum extent possible, whether
at the time of the initial grant or subsequently, any and all of the
rights conveyed by this License.
1.10. "Modifications"
means any of the following:
a. any file in Source Code Form that results from an addition to,
deletion from, or modification of the contents of Covered Software; or
b. any new file in Source Code Form that contains any Covered Software.
1.11. "Patent Claims" of a Contributor
means any patent claim(s), including without limitation, method,
process, and apparatus claims, in any patent Licensable by such
Contributor that would be infringed, but for the grant of the License,
by the making, using, selling, offering for sale, having made, import,
or transfer of either its Contributions or its Contributor Version.
1.12. "Secondary License"
means either the GNU General Public License, Version 2.0, the GNU Lesser
General Public License, Version 2.1, the GNU Affero General Public
License, Version 3.0, or any later versions of those licenses.
1.13. "Source Code Form"
means the form of the work preferred for making modifications.
1.14. "You" (or "Your")
means an individual or a legal entity exercising rights under this
License. For legal entities, "You" includes any entity that controls, is
controlled by, or is under common control with You. For purposes of this
definition, "control" means (a) the power, direct or indirect, to cause
the direction or management of such entity, whether by contract or
otherwise, or (b) ownership of more than fifty percent (50%) of the
outstanding shares or beneficial ownership of such entity.
2. License Grants and Conditions
2.1. Grants
Each Contributor hereby grants You a world-wide, royalty-free,
non-exclusive license:
a. under intellectual property rights (other than patent or trademark)
Licensable by such Contributor to use, reproduce, make available,
modify, display, perform, distribute, and otherwise exploit its
Contributions, either on an unmodified basis, with Modifications, or
as part of a Larger Work; and
b. under Patent Claims of such Contributor to make, use, sell, offer for
sale, have made, import, and otherwise transfer either its
Contributions or its Contributor Version.
2.2. Effective Date
The licenses granted in Section 2.1 with respect to any Contribution
become effective for each Contribution on the date the Contributor first
distributes such Contribution.
2.3. Limitations on Grant Scope
The licenses granted in this Section 2 are the only rights granted under
this License. No additional rights or licenses will be implied from the
distribution or licensing of Covered Software under this License.
Notwithstanding Section 2.1(b) above, no patent license is granted by a
Contributor:
a. for any code that a Contributor has removed from Covered Software; or
b. for infringements caused by: (i) Your and any other third party's
modifications of Covered Software, or (ii) the combination of its
Contributions with other software (except as part of its Contributor
Version); or
c. under Patent Claims infringed by Covered Software in the absence of
its Contributions.
This License does not grant any rights in the trademarks, service marks,
or logos of any Contributor (except as may be necessary to comply with
the notice requirements in Section 3.4).
2.4. Subsequent Licenses
No Contributor makes additional grants as a result of Your choice to
distribute the Covered Software under a subsequent version of this
License (see Section 10.2) or under the terms of a Secondary License (if
permitted under the terms of Section 3.3).
2.5. Representation
Each Contributor represents that the Contributor believes its
Contributions are its original creation(s) or it has sufficient rights to
grant the rights to its Contributions conveyed by this License.
2.6. Fair Use
This License is not intended to limit any rights You have under
applicable copyright doctrines of fair use, fair dealing, or other
equivalents.
2.7. Conditions
Sections 3.1, 3.2, 3.3, and 3.4 are conditions of the licenses granted in
Section 2.1.
3. Responsibilities
3.1. Distribution of Source Form
All distribution of Covered Software in Source Code Form, including any
Modifications that You create or to which You contribute, must be under
the terms of this License. You must inform recipients that the Source
Code Form of the Covered Software is governed by the terms of this
License, and how they can obtain a copy of this License. You may not
attempt to alter or restrict the recipients' rights in the Source Code
Form.
3.2. Distribution of Executable Form
If You distribute Covered Software in Executable Form then:
a. such Covered Software must also be made available in Source Code Form,
as described in Section 3.1, and You must inform recipients of the
Executable Form how they can obtain a copy of such Source Code Form by
reasonable means in a timely manner, at a charge no more than the cost
of distribution to the recipient; and
b. You may distribute such Executable Form under the terms of this
License, or sublicense it under different terms, provided that the
license for the Executable Form does not attempt to limit or alter the
recipients' rights in the Source Code Form under this License.
3.3. Distribution of a Larger Work
You may create and distribute a Larger Work under terms of Your choice,
provided that You also comply with the requirements of this License for
the Covered Software. If the Larger Work is a combination of Covered
Software with a work governed by one or more Secondary Licenses, and the
Covered Software is not Incompatible With Secondary Licenses, this
License permits You to additionally distribute such Covered Software
under the terms of such Secondary License(s), so that the recipient of
the Larger Work may, at their option, further distribute the Covered
Software under the terms of either this License or such Secondary
License(s).
3.4. Notices
You may not remove or alter the substance of any license notices
(including copyright notices, patent notices, disclaimers of warranty, or
limitations of liability) contained within the Source Code Form of the
Covered Software, except that You may alter any license notices to the
extent required to remedy known factual inaccuracies.
3.5. Application of Additional Terms
You may choose to offer, and to charge a fee for, warranty, support,
indemnity or liability obligations to one or more recipients of Covered
Software. However, You may do so only on Your own behalf, and not on
behalf of any Contributor. You must make it absolutely clear that any
such warranty, support, indemnity, or liability obligation is offered by
You alone, and You hereby agree to indemnify every Contributor for any
liability incurred by such Contributor as a result of warranty, support,
indemnity or liability terms You offer. You may include additional
disclaimers of warranty and limitations of liability specific to any
jurisdiction.
4. Inability to Comply Due to Statute or Regulation
If it is impossible for You to comply with any of the terms of this License
with respect to some or all of the Covered Software due to statute,
judicial order, or regulation then You must: (a) comply with the terms of
this License to the maximum extent possible; and (b) describe the
limitations and the code they affect. Such description must be placed in a
text file included with all distributions of the Covered Software under
this License. Except to the extent prohibited by statute or regulation,
such description must be sufficiently detailed for a recipient of ordinary
skill to be able to understand it.
5. Termination
5.1. The rights granted under this License will terminate automatically if You
fail to comply with any of its terms. However, if You become compliant,
then the rights granted under this License from a particular Contributor
are reinstated (a) provisionally, unless and until such Contributor
explicitly and finally terminates Your grants, and (b) on an ongoing
basis, if such Contributor fails to notify You of the non-compliance by
some reasonable means prior to 60 days after You have come back into
compliance. Moreover, Your grants from a particular Contributor are
reinstated on an ongoing basis if such Contributor notifies You of the
non-compliance by some reasonable means, this is the first time You have
received notice of non-compliance with this License from such
Contributor, and You become compliant prior to 30 days after Your receipt
of the notice.
5.2. If You initiate litigation against any entity by asserting a patent
infringement claim (excluding declaratory judgment actions,
counter-claims, and cross-claims) alleging that a Contributor Version
directly or indirectly infringes any patent, then the rights granted to
You by any and all Contributors for the Covered Software under Section
2.1 of this License shall terminate.
5.3. In the event of termination under Sections 5.1 or 5.2 above, all end user
license agreements (excluding distributors and resellers) which have been
validly granted by You or Your distributors under this License prior to
termination shall survive termination.
6. Disclaimer of Warranty
Covered Software is provided under this License on an "as is" basis,
without warranty of any kind, either expressed, implied, or statutory,
including, without limitation, warranties that the Covered Software is free
of defects, merchantable, fit for a particular purpose or non-infringing.
The entire risk as to the quality and performance of the Covered Software
is with You. Should any Covered Software prove defective in any respect,
You (not any Contributor) assume the cost of any necessary servicing,
repair, or correction. This disclaimer of warranty constitutes an essential
part of this License. No use of any Covered Software is authorized under
this License except under this disclaimer.
7. Limitation of Liability
Under no circumstances and under no legal theory, whether tort (including
negligence), contract, or otherwise, shall any Contributor, or anyone who
distributes Covered Software as permitted above, be liable to You for any
direct, indirect, special, incidental, or consequential damages of any
character including, without limitation, damages for lost profits, loss of
goodwill, work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses, even if such party shall have been
informed of the possibility of such damages. This limitation of liability
shall not apply to liability for death or personal injury resulting from
such party's negligence to the extent applicable law prohibits such
limitation. Some jurisdictions do not allow the exclusion or limitation of
incidental or consequential damages, so this exclusion and limitation may
not apply to You.
8. Litigation
Any litigation relating to this License may be brought only in the courts
of a jurisdiction where the defendant maintains its principal place of
business and such litigation shall be governed by laws of that
jurisdiction, without reference to its conflict-of-law provisions. Nothing
in this Section shall prevent a party's ability to bring cross-claims or
counter-claims.
9. Miscellaneous
This License represents the complete agreement concerning the subject
matter hereof. If any provision of this License is held to be
unenforceable, such provision shall be reformed only to the extent
necessary to make it enforceable. Any law or regulation which provides that
the language of a contract shall be construed against the drafter shall not
be used to construe this License against a Contributor.
10. Versions of the License
10.1. New Versions
Mozilla Foundation is the license steward. Except as provided in Section
10.3, no one other than the license steward has the right to modify or
publish new versions of this License. Each version will be given a
distinguishing version number.
10.2. Effect of New Versions
You may distribute the Covered Software under the terms of the version
of the License under which You originally received the Covered Software,
or under the terms of any subsequent version published by the license
steward.
10.3. Modified Versions
If you create software not governed by this License, and you want to
create a new license for such software, you may create and use a
modified version of this License if you rename the license and remove
any references to the name of the license steward (except to note that
such modified license differs from this License).
10.4. Distributing Source Code Form that is Incompatible With Secondary
Licenses If You choose to distribute Source Code Form that is
Incompatible With Secondary Licenses under the terms of this version of
the License, the notice described in Exhibit B of this License must be
attached.
Exhibit A - Source Code Form License Notice
This Source Code Form is subject to the
terms of the Mozilla Public License, v.
2.0. If a copy of the MPL was not
distributed with this file, You can
obtain one at
http://mozilla.org/MPL/2.0/.
If it is not possible or desirable to put the notice in a particular file,
then You may include the notice in a location (such as a LICENSE file in a
relevant directory) where a recipient would be likely to look for such a
notice.
You may add additional accurate notices of copyright ownership.
Exhibit B - "Incompatible With Secondary Licenses" Notice
This Source Code Form is "Incompatible
With Secondary Licenses", as defined by
the Mozilla Public License, v. 2.0.


@ -1,8 +0,0 @@
# JSONFeed - Go Package to parse JSON Feed streams
[![Build Status](https://travis-ci.org/st3fan/jsonfeed.svg?branch=master)](https://travis-ci.org/st3fan/jsonfeed) [![Go Report Card](https://goreportcard.com/badge/github.com/st3fan/jsonfeed)](https://goreportcard.com/report/github.com/st3fan/jsonfeed) [![codecov](https://codecov.io/gh/st3fan/jsonfeed/branch/master/graph/badge.svg)](https://codecov.io/gh/st3fan/jsonfeed)
*Stefan Arentz, May 2017*
Work in progress. Minimal package to parse JSON Feed streams. Please file feature requests.


@ -1,73 +0,0 @@
// This Source Code Form is subject to the terms of the Mozilla Public
// License, v. 2.0. If a copy of the MPL was not distributed with this
// file, You can obtain one at http://mozilla.org/MPL/2.0/
package jsonfeed
import (
"encoding/json"
"io"
"time"
)
const CurrentVersion = "https://jsonfeed.org/version/1"
type Item struct {
ID string `json:"id"`
URL string `json:"url"`
ExternalURL string `json:"external_url"`
Title string `json:"title"`
ContentHTML string `json:"content_html"`
ContentText string `json:"content_text"`
Summary string `json:"summary"`
Image string `json:"image"`
BannerImage string `json:"banner_image"`
DatePublished time.Time `json:"date_published"`
DateModified time.Time `json:"date_modified"`
Author Author `json:"author"`
Tags []string `json:"tags"`
}
type Author struct {
Name string `json:"name"`
URL string `json:"url"`
Avatar string `json:"avatar"`
}
type Hub struct {
Type string `json:"type"`
URL string `json:"url"`
}
type Attachment struct {
URL string `json:"url"`
MIMEType string `json:"mime_type"`
Title string `json:"title"`
SizeInBytes int64 `json:"size_in_bytes"`
DurationInSeconds int64 `json:"duration_in_seconds"`
}
type Feed struct {
Version string `json:"version"`
Title string `json:"title"`
HomePageURL string `json:"home_page_url"`
FeedURL string `json:"feed_url"`
Description string `json:"description"`
UserComment string `json:"user_comment"`
NextURL string `json:"next_url"`
Icon string `json:"icon"`
Favicon string `json:"favicon"`
Author Author `json:"author"`
Expired bool `json:"expired"`
Hubs []Hub `json:"hubs"`
Items []Item `json:"items"`
}
func Parse(r io.Reader) (Feed, error) {
var feed Feed
decoder := json.NewDecoder(r)
if err := decoder.Decode(&feed); err != nil {
return Feed{}, err
}
return feed, nil
}


@ -1,42 +0,0 @@
// This Source Code Form is subject to the terms of the Mozilla Public
// License, v. 2.0. If a copy of the MPL was not distributed with this
// file, You can obtain one at http://mozilla.org/MPL/2.0/
package jsonfeed
import (
"os"
"testing"
"time"
"github.com/stretchr/testify/assert"
)
func TestParseSimple(t *testing.T) {
r, err := os.Open("testdata/feed.json")
assert.NoError(t, err, "Could not open testdata/feed.json")
feed, err := Parse(r)
assert.NoError(t, err, "Could not parse testdata/feed.json")
assert.Equal(t, "https://jsonfeed.org/version/1", feed.Version)
assert.Equal(t, "JSON Feed", feed.Title)
assert.Equal(t, "JSON Feed is a ...", feed.Description)
assert.Equal(t, "https://jsonfeed.org/", feed.HomePageURL)
assert.Equal(t, "https://jsonfeed.org/feed.json", feed.FeedURL)
assert.Equal(t, "This feed allows ...", feed.UserComment)
assert.Equal(t, "https://jsonfeed.org/graphics/icon.png", feed.Favicon)
assert.Equal(t, "Brent Simmons and Manton Reece", feed.Author.Name)
assert.Equal(t, 1, len(feed.Items))
assert.Equal(t, "https://jsonfeed.org/2017/05/17/announcing_json_feed", feed.Items[0].ID)
assert.Equal(t, "https://jsonfeed.org/2017/05/17/announcing_json_feed", feed.Items[0].URL)
assert.Equal(t, "Announcing JSON Feed", feed.Items[0].Title)
assert.Equal(t, "<p>We ...", feed.Items[0].ContentHTML)
datePublished, err := time.Parse("2006-01-02T15:04:05-07:00", "2017-05-17T08:02:12-07:00")
assert.NoError(t, err, "Could not parse timestamp")
assert.Equal(t, datePublished, feed.Items[0].DatePublished)
}


@ -1,21 +0,0 @@
{
"version": "https://jsonfeed.org/version/1",
"title": "JSON Feed",
"description": "JSON Feed is a ...",
"home_page_url": "https://jsonfeed.org/",
"feed_url": "https://jsonfeed.org/feed.json",
"user_comment": "This feed allows ...",
"favicon": "https://jsonfeed.org/graphics/icon.png",
"author": {
"name": "Brent Simmons and Manton Reece"
},
"items": [
{
"id": "https://jsonfeed.org/2017/05/17/announcing_json_feed",
"url": "https://jsonfeed.org/2017/05/17/announcing_json_feed",
"title": "Announcing JSON Feed",
"content_html": "<p>We ...",
"date_published": "2017-05-17T08:02:12-07:00"
}
]
}

lib/go_vanity/Cargo.toml Normal file

@ -0,0 +1,15 @@
[package]
name = "go_vanity"
version = "0.1.0"
authors = ["Christine Dodrill <me@christine.website>"]
edition = "2018"
build = "src/build.rs"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
mime = "0.3.0"
warp = "0.2"
[build-dependencies]
ructe = { version = "0.11", features = ["warp02"] }


@ -0,0 +1,5 @@
use ructe::{Result, Ructe};
fn main() -> Result<()> {
Ructe::from_env()?.compile_templates("templates")
}

lib/go_vanity/src/lib.rs Normal file

@ -0,0 +1,12 @@
use warp::{http::Response, Rejection, Reply};
use crate::templates::{RenderRucte};
include!(concat!(env!("OUT_DIR"), "/templates.rs"));
pub async fn gitea(pkg_name: &str, git_repo: &str) -> Result<impl Reply, Rejection> {
Response::builder().html(|o| templates::gitea_html(o, pkg_name, git_repo))
}
pub async fn github(pkg_name: &str, git_repo: &str) -> Result<impl Reply, Rejection> {
Response::builder().html(|o| templates::github_html(o, pkg_name, git_repo))
}
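
The two handlers above are ordinary warp handlers, so the site crate can mount them on whatever route it likes. A minimal wiring sketch follows (illustrative only, not part of this commit; the route, package path, and repository URL are made up, and it assumes warp 0.2 plus a tokio runtime in the consuming crate):

```rust
use warp::Filter;

#[tokio::main]
async fn main() {
    // Hypothetical route: serve a Go vanity-import page at /myrepo that points
    // `go get` at a Gitea-hosted repository. Package path and repo URL are invented.
    let vanity = warp::path!("myrepo").and_then(|| {
        go_vanity::gitea("example.com/myrepo", "https://git.example.com/user/myrepo")
    });

    warp::serve(vanity).run(([127, 0, 0, 1], 3030)).await;
}
```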


@ -0,0 +1,14 @@
@(pkg_name: &str, git_repo: &str)
<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8"/>
<meta name="go-import" content="@pkg_name git @git_repo">
<meta name="go-source" content="@pkg_name @git_repo @git_repo/src/master@{/dir@} @git_repo/src/master@{/dir@}/@{file@}#L@{line@}">
<meta http-equiv="refresh" content="0; url=https://godoc.org/@pkg_name">
</head>
<body>
Please see <a href="https://godoc.org/@pkg_name">here</a> for documentation on this package.
</body>
</html>


@ -0,0 +1,14 @@
@(pkg_name: &str, git_repo: &str)
<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8"/>
<meta name="go-import" content="@pkg_name git @git_repo">
<meta name="go-source" content="@pkg_name @git_repo @git_repo/tree/master@{/dir@} @git_repo/blob/master@{/dir@}/@{file@}#L@{line@}">
<meta http-equiv="refresh" content="0; url=https://godoc.org/@pkg_name">
</head>
<body>
Please see <a href="https://godoc.org/@pkg_name">here</a> for documentation on this package.
</body>
</html>

lib/jsonfeed/.gitignore vendored Normal file

@ -0,0 +1,4 @@
target/
**/*.rs.bk
Cargo.lock
*.html

lib/jsonfeed/Cargo.toml Normal file

@ -0,0 +1,15 @@
[package]
authors = ["Paul Woolcock <paul@woolcock.us>", "Christine Dodrill <me@christine.website>"]
description = "Parser for the JSONFeed (http://jsonfeed.org) specification\n"
documentation = "https://docs.rs/jsonfeed"
homepage = "https://github.com/pwoolcoc/jsonfeed"
license = "MIT/Apache-2.0"
name = "jsonfeed"
readme = "README.adoc"
version = "0.3.0"
[dependencies]
error-chain = "0.12"
serde = "1"
serde_derive = "1"
serde_json = "1"

lib/jsonfeed/LICENSE-APACHE Normal file

@ -0,0 +1,201 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

lib/jsonfeed/LICENSE-MIT Normal file

@ -0,0 +1,25 @@
Copyright (c) 2014 The Rust Project Developers
Permission is hereby granted, free of charge, to any
person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the
Software without restriction, including without
limitation the rights to use, copy, modify, merge,
publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following
conditions:
The above copyright notice and this permission notice
shall be included in all copies or substantial portions
of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.

lib/jsonfeed/README.adoc Normal file

@ -0,0 +1,27 @@
= JSON Feed Parser
[link=https://github.com/pwoolcoc/jsonfeed]
image::https://img.shields.io/crates/v/jsonfeed.svg[JSON Feed crate version]
This is a http://jsonfeed.org[JSON Feed] parser in Rust. Just a thin layer on top of `serde`, but it
provides serialization & deserialization, along with a Builder API for constructing feeds.
Note that this is alpha; I still need to add a lot of tests and a couple more features.
== Example
----
extern crate jsonfeed;
extern crate reqwest;
fn main() {
let resp = reqwest::get("https://example.com/feed.json").unwrap();
let feed = jsonfeed::from_reader(resp).unwrap();
println!("Feed title is: {}", feed.title);
}
----
TODO:
* Tests. Lots and lots of tests
* Implement ability to add, serialize, and deserialize custom attributes from the json feed spec

lib/jsonfeed/src/builder.rs Normal file

@ -0,0 +1,204 @@
use std::default::Default;
use errors::*;
use feed::{Feed, Author, Attachment};
use item::{Content, Item};
/// Feed Builder
///
/// This is used to programmatically build up a Feed object,
/// which can be serialized later into a JSON string
pub struct Builder(Feed);
impl Builder {
pub fn new() -> Builder {
Builder(Feed::default())
}
pub fn title<I: Into<String>>(mut self, t: I) -> Builder {
self.0.title = t.into();
self
}
pub fn home_page_url<I: Into<String>>(mut self, url: I) -> Builder {
self.0.home_page_url = Some(url.into());
self
}
pub fn feed_url<I: Into<String>>(mut self, url: I) -> Builder {
self.0.feed_url = Some(url.into());
self
}
pub fn description<I: Into<String>>(mut self, desc: I) -> Builder {
self.0.description = Some(desc.into());
self
}
pub fn user_comment<I: Into<String>>(mut self, cmt: I) -> Builder {
self.0.user_comment = Some(cmt.into());
self
}
pub fn next_url<I: Into<String>>(mut self, url: I) -> Builder {
self.0.next_url = Some(url.into());
self
}
pub fn icon<I: Into<String>>(mut self, url: I) -> Builder {
self.0.icon = Some(url.into());
self
}
pub fn favicon<I: Into<String>>(mut self, url: I) -> Builder {
self.0.favicon = Some(url.into());
self
}
pub fn author(mut self, author: Author) -> Builder {
self.0.author = Some(author);
self
}
pub fn expired(mut self) -> Builder {
self.0.expired = Some(true);
self
}
pub fn item(mut self, item: Item) -> Builder {
self.0.items.push(item);
self
}
pub fn build(self) -> Feed {
self.0
}
}
/// Builder object for an item in a feed
pub struct ItemBuilder {
pub id: Option<String>,
pub url: Option<String>,
pub external_url: Option<String>,
pub title: Option<String>,
pub content: Option<Content>,
pub summary: Option<String>,
pub image: Option<String>,
pub banner_image: Option<String>,
pub date_published: Option<String>,
pub date_modified: Option<String>,
pub author: Option<Author>,
pub tags: Option<Vec<String>>,
pub attachments: Option<Vec<Attachment>>,
}
impl ItemBuilder {
pub fn new() -> ItemBuilder {
ItemBuilder {
id: None,
url: None,
external_url: None,
title: None,
content: None,
summary: None,
image: None,
banner_image: None,
date_published: None,
date_modified: None,
author: None,
tags: None,
attachments: None,
}
}
pub fn title<I: Into<String>>(mut self, i: I) -> ItemBuilder {
self.title = Some(i.into());
self
}
pub fn image<I: Into<String>>(mut self, i: I) -> ItemBuilder {
self.image = Some(i.into());
self
}
pub fn id<I: Into<String>>(mut self, i: I) -> ItemBuilder {
self.id = Some(i.into());
self
}
pub fn url<I: Into<String>>(mut self, i: I) -> ItemBuilder {
self.url = Some(i.into());
self
}
pub fn external_url<I: Into<String>>(mut self, i: I) -> ItemBuilder {
self.external_url = Some(i.into());
self
}
pub fn date_modified<I: Into<String>>(mut self, i: I) -> ItemBuilder {
self.date_modified = Some(i.into());
self
}
pub fn date_published<I: Into<String>>(mut self, i: I) -> ItemBuilder {
self.date_published = Some(i.into());
self
}
pub fn tags(mut self, tags: Vec<String>) -> ItemBuilder {
self.tags = Some(tags);
self
}
pub fn author(mut self, who: Author) -> ItemBuilder {
self.author = Some(who);
self
}
pub fn content_html<I: Into<String>>(mut self, i: I) -> ItemBuilder {
match self.content {
Some(Content::Text(t)) => {
self.content = Some(Content::Both(i.into(), t));
},
_ => {
self.content = Some(Content::Html(i.into()));
}
}
self
}
pub fn content_text<I: Into<String>>(mut self, i: I) -> ItemBuilder {
match self.content {
Some(Content::Html(s)) => {
self.content = Some(Content::Both(s, i.into()));
},
_ => {
self.content = Some(Content::Text(i.into()));
},
}
self
}
pub fn build(self) -> Result<Item> {
if self.id.is_none() || self.content.is_none() {
return Err("missing field 'id' or 'content_*'".into());
}
Ok(Item {
id: self.id.unwrap(),
url: self.url,
external_url: self.external_url,
title: self.title,
content: self.content.unwrap(),
summary: self.summary,
image: self.image,
banner_image: self.banner_image,
date_published: self.date_published,
date_modified: self.date_modified,
author: self.author,
tags: self.tags,
attachments: self.attachments
})
}
}
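
For orientation, here is a minimal sketch of how the two builders above fit together in a crate that depends on this library (illustrative only, not part of this commit; it assumes `Item`, `Feed`, and `to_string` are re-exported at the crate root, as the doc examples in feed.rs suggest):

```rust
fn main() {
    // Build a single item; build() checks that `id` and a content_* field are set.
    let item = jsonfeed::Item::builder()
        .id("https://example.com/posts/1")
        .title("First post")
        .content_html("<p>Hello, world!</p>")
        .build()
        .expect("id and content are set");

    // Attach the item to a feed and serialize the whole thing to JSON.
    let feed = jsonfeed::Feed::builder()
        .title("Example feed")
        .home_page_url("https://example.com/")
        .item(item)
        .build();

    println!("{}", jsonfeed::to_string(&feed).expect("feed serializes to JSON"));
}
```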


@ -0,0 +1,7 @@
use serde_json;
error_chain!{
foreign_links {
Serde(serde_json::Error);
}
}
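
The `foreign_links` block above is what lets a `serde_json::Error` convert into this crate's `Error` through `?`. A rough sketch of the kind of helper that enables (illustrative only, not part of this commit, and not necessarily how the crate's own parsing functions are written):

```rust
use serde_json;

use errors::*;
use feed::Feed;

/// Parse a JSON Feed document from a string slice. The `?` operator converts
/// a serde_json::Error into this crate's Error via the foreign_links above.
pub fn parse_feed(s: &str) -> Result<Feed> {
    let feed: Feed = serde_json::from_str(s)?;
    Ok(feed)
}
```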

lib/jsonfeed/src/feed.rs Normal file

@ -0,0 +1,296 @@
use std::default::Default;
use item::Item;
use builder::Builder;
const VERSION_1: &'static str = "https://jsonfeed.org/version/1";
/// Represents a single feed
///
/// # Examples
///
/// ```rust
/// // Serialize a feed object to a JSON string
///
/// # extern crate jsonfeed;
/// # use std::default::Default;
/// # use jsonfeed::Feed;
/// # fn main() {
/// let feed: Feed = Feed::default();
/// assert_eq!(
/// jsonfeed::to_string(&feed).unwrap(),
/// "{\"version\":\"https://jsonfeed.org/version/1\",\"title\":\"\",\"items\":[]}"
/// );
/// # }
/// ```
///
/// ```rust
/// // Deserialize a feed objects from a JSON String
///
/// # extern crate jsonfeed;
/// # use jsonfeed::Feed;
/// # fn main() {
/// let json = "{\"version\":\"https://jsonfeed.org/version/1\",\"title\":\"\",\"items\":[]}";
/// let feed: Feed = jsonfeed::from_str(&json).unwrap();
/// assert_eq!(
/// feed,
/// Feed::default()
/// );
/// # }
/// ```
#[derive(Debug, Clone, PartialEq, Deserialize, Serialize)]
pub struct Feed {
pub version: String,
pub title: String,
pub items: Vec<Item>,
#[serde(skip_serializing_if = "Option::is_none")]
pub home_page_url: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub feed_url: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub description: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub user_comment: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub next_url: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub icon: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub favicon: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub author: Option<Author>,
#[serde(skip_serializing_if = "Option::is_none")]
pub expired: Option<bool>,
#[serde(skip_serializing_if = "Option::is_none")]
pub hubs: Option<Vec<Hub>>,
}
impl Feed {
/// Used to construct a Feed object
pub fn builder() -> Builder {
Builder::new()
}
}
impl Default for Feed {
fn default() -> Feed {
Feed {
version: VERSION_1.to_string(),
title: "".to_string(),
items: vec![],
home_page_url: None,
feed_url: None,
description: None,
user_comment: None,
next_url: None,
icon: None,
favicon: None,
author: None,
expired: None,
hubs: None,
}
}
}
/// Represents an `attachment` for an item
#[derive(Debug, Clone, PartialEq, Deserialize, Serialize)]
pub struct Attachment {
url: String,
mime_type: String,
title: Option<String>,
size_in_bytes: Option<u64>,
duration_in_seconds: Option<u64>,
}
/// Represents an `author` in both a feed and a feed item
#[derive(Debug, Clone, PartialEq, Deserialize, Serialize)]
pub struct Author {
name: Option<String>,
url: Option<String>,
avatar: Option<String>,
}
impl Author {
pub fn new() -> Author {
Author {
name: None,
url: None,
avatar: None,
}
}
pub fn name<I: Into<String>>(mut self, name: I) -> Self {
self.name = Some(name.into());
self
}
pub fn url<I: Into<String>>(mut self, url: I) -> Self {
self.url = Some(url.into());
self
}
pub fn avatar<I: Into<String>>(mut self, avatar: I) -> Self {
self.avatar = Some(avatar.into());
self
}
}
/// Represents a `hub` for a feed
#[derive(Debug, Clone, PartialEq, Deserialize, Serialize)]
pub struct Hub {
#[serde(rename = "type")]
type_: String,
url: String,
}
#[cfg(test)]
mod tests {
use serde_json;
use std::default::Default;
use super::*;
#[test]
fn serialize_feed() {
let feed = Feed {
version: "https://jsonfeed.org/version/1".to_string(),
title: "some title".to_string(),
items: vec![],
home_page_url: None,
description: None,
expired: Some(true),
..Default::default()
};
assert_eq!(
serde_json::to_string(&feed).unwrap(),
r#"{"version":"https://jsonfeed.org/version/1","title":"some title","items":[],"expired":true}"#
);
}
#[test]
fn deserialize_feed() {
let json = r#"{"version":"https://jsonfeed.org/version/1","title":"some title","items":[]}"#;
let feed: Feed = serde_json::from_str(&json).unwrap();
let expected = Feed {
version: "https://jsonfeed.org/version/1".to_string(),
title: "some title".to_string(),
items: vec![],
..Default::default()
};
assert_eq!(
feed,
expected
);
}
#[test]
fn serialize_attachment() {
let attachment = Attachment {
url: "http://example.com".to_string(),
mime_type: "application/json".to_string(),
title: Some("some title".to_string()),
size_in_bytes: Some(1),
duration_in_seconds: Some(1),
};
assert_eq!(
serde_json::to_string(&attachment).unwrap(),
r#"{"url":"http://example.com","mime_type":"application/json","title":"some title","size_in_bytes":1,"duration_in_seconds":1}"#
);
}
#[test]
fn deserialize_attachment() {
let json = r#"{"url":"http://example.com","mime_type":"application/json","title":"some title","size_in_bytes":1,"duration_in_seconds":1}"#;
let attachment: Attachment = serde_json::from_str(&json).unwrap();
let expected = Attachment {
url: "http://example.com".to_string(),
mime_type: "application/json".to_string(),
title: Some("some title".to_string()),
size_in_bytes: Some(1),
duration_in_seconds: Some(1),
};
assert_eq!(
attachment,
expected
);
}
#[test]
fn serialize_author() {
let author = Author {
name: Some("bob jones".to_string()),
url: Some("http://example.com".to_string()),
avatar: Some("http://img.com/blah".to_string()),
};
assert_eq!(
serde_json::to_string(&author).unwrap(),
r#"{"name":"bob jones","url":"http://example.com","avatar":"http://img.com/blah"}"#
);
}
#[test]
fn deserialize_author() {
let json = r#"{"name":"bob jones","url":"http://example.com","avatar":"http://img.com/blah"}"#;
let author: Author = serde_json::from_str(&json).unwrap();
let expected = Author {
name: Some("bob jones".to_string()),
url: Some("http://example.com".to_string()),
avatar: Some("http://img.com/blah".to_string()),
};
assert_eq!(
author,
expected
);
}
#[test]
fn serialize_hub() {
let hub = Hub {
type_: "some-type".to_string(),
url: "http://example.com".to_string(),
};
assert_eq!(
serde_json::to_string(&hub).unwrap(),
r#"{"type":"some-type","url":"http://example.com"}"#
)
}
#[test]
fn deserialize_hub() {
let json = r#"{"type":"some-type","url":"http://example.com"}"#;
let hub: Hub = serde_json::from_str(&json).unwrap();
let expected = Hub {
type_: "some-type".to_string(),
url: "http://example.com".to_string(),
};
assert_eq!(
hub,
expected
);
}
#[test]
fn deser_podcast() {
let json = r#"{
"version": "https://jsonfeed.org/version/1",
"title": "Timetable",
"home_page_url": "http://timetable.manton.org/",
"items": [
{
"id": "http://timetable.manton.org/2017/04/episode-45-launch-week/",
"url": "http://timetable.manton.org/2017/04/episode-45-launch-week/",
"title": "Episode 45: Launch week",
"content_html": "Im rolling out early access to Micro.blog this week. I talk about how the first 2 days have gone, mistakes with TestFlight, and what to do next.",
"date_published": "2017-04-26T01:09:45+00:00",
"attachments": [
{
"url": "http://timetable.manton.org/podcast-download/139/episode-45-launch-week.mp3",
"mime_type": "audio/mpeg",
"size_in_bytes": 5236920
}
]
}
]
}"#;
serde_json::from_str::<Feed>(&json).expect("Failed to deserialize podcast feed");
}
}

493
lib/jsonfeed/src/item.rs Normal file

@ -0,0 +1,493 @@
use std::fmt;
use std::default::Default;
use feed::{Author, Attachment};
use builder::ItemBuilder;
use serde::ser::{Serialize, Serializer, SerializeStruct};
use serde::de::{self, Deserialize, Deserializer, Visitor, MapAccess};
/// Represents the `content_html` and `content_text` attributes of an item
#[derive(Debug, Clone, PartialEq, Deserialize, Serialize)]
pub enum Content {
Html(String),
Text(String),
Both(String, String),
}
/// Represents an item in a feed
#[derive(Debug, Clone, PartialEq)]
pub struct Item {
pub id: String,
pub url: Option<String>,
pub external_url: Option<String>,
pub title: Option<String>,
pub content: Content,
pub summary: Option<String>,
pub image: Option<String>,
pub banner_image: Option<String>,
pub date_published: Option<String>, // todo DateTime objects?
pub date_modified: Option<String>,
pub author: Option<Author>,
pub tags: Option<Vec<String>>,
pub attachments: Option<Vec<Attachment>>,
}
impl Item {
pub fn builder() -> ItemBuilder {
ItemBuilder::new()
}
}
impl Default for Item {
fn default() -> Item {
Item {
id: "".to_string(),
url: None,
external_url: None,
title: None,
content: Content::Text("".into()),
summary: None,
image: None,
banner_image: None,
date_published: None,
date_modified: None,
author: None,
tags: None,
attachments: None,
}
}
}
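// Item implements Serialize and Deserialize by hand instead of deriving them:
// the single `content` field has to fan out into the JSON Feed `content_html`
// and/or `content_text` keys, and the optional fields are skipped entirely
// when they are None instead of being emitted.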
impl Serialize for Item {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where S: Serializer
{
let mut state = serializer.serialize_struct("Item", 14)?;
state.serialize_field("id", &self.id)?;
if self.url.is_some() {
state.serialize_field("url", &self.url)?;
}
if self.external_url.is_some() {
state.serialize_field("external_url", &self.external_url)?;
}
if self.title.is_some() {
state.serialize_field("title", &self.title)?;
}
match self.content {
Content::Html(ref s) => {
state.serialize_field("content_html", s)?;
state.serialize_field("content_text", &None::<Option<&str>>)?;
},
Content::Text(ref s) => {
state.serialize_field("content_html", &None::<Option<&str>>)?;
state.serialize_field("content_text", s)?;
},
Content::Both(ref s, ref t) => {
state.serialize_field("content_html", s)?;
state.serialize_field("content_text", t)?;
},
};
if self.summary.is_some() {
state.serialize_field("summary", &self.summary)?;
}
if self.image.is_some() {
state.serialize_field("image", &self.image)?;
}
if self.banner_image.is_some() {
state.serialize_field("banner_image", &self.banner_image)?;
}
if self.date_published.is_some() {
state.serialize_field("date_published", &self.date_published)?;
}
if self.date_modified.is_some() {
state.serialize_field("date_modified", &self.date_modified)?;
}
if self.author.is_some() {
state.serialize_field("author", &self.author)?;
}
if self.tags.is_some() {
state.serialize_field("tags", &self.tags)?;
}
if self.attachments.is_some() {
state.serialize_field("attachments", &self.attachments)?;
}
state.end()
}
}
impl<'de> Deserialize<'de> for Item {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where D: Deserializer<'de>
{
enum Field {
Id,
Url,
ExternalUrl,
Title,
ContentHtml,
ContentText,
Summary,
Image,
BannerImage,
DatePublished,
DateModified,
Author,
Tags,
Attachments,
};
impl<'de> Deserialize<'de> for Field {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where D: Deserializer<'de>
{
struct FieldVisitor;
impl<'de> Visitor<'de> for FieldVisitor {
type Value = Field;
fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
formatter.write_str("non-expected field")
}
fn visit_str<E>(self, value: &str) -> Result<Field, E>
where E: de::Error
{
match value {
"id" => Ok(Field::Id),
"url" => Ok(Field::Url),
"external_url" => Ok(Field::ExternalUrl),
"title" => Ok(Field::Title),
"content_html" => Ok(Field::ContentHtml),
"content_text" => Ok(Field::ContentText),
"summary" => Ok(Field::Summary),
"image" => Ok(Field::Image),
"banner_image" => Ok(Field::BannerImage),
"date_published" => Ok(Field::DatePublished),
"date_modified" => Ok(Field::DateModified),
"author" => Ok(Field::Author),
"tags" => Ok(Field::Tags),
"attachments" => Ok(Field::Attachments),
_ => Err(de::Error::unknown_field(value, FIELDS)),
}
}
}
deserializer.deserialize_identifier(FieldVisitor)
}
}
struct ItemVisitor;
impl<'de> Visitor<'de> for ItemVisitor {
type Value = Item;
fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
formatter.write_str("non-expected thing")
}
fn visit_map<V>(self, mut map: V) -> Result<Item, V::Error>
where V: MapAccess<'de>
{
let mut id = None;
let mut url = None;
let mut external_url = None;
let mut title = None;
let mut content_html: Option<String> = None;
let mut content_text: Option<String> = None;
let mut summary = None;
let mut image = None;
let mut banner_image = None;
let mut date_published = None;
let mut date_modified = None;
let mut author = None;
let mut tags = None;
let mut attachments = None;
while let Some(key) = map.next_key()? {
match key {
Field::Id => {
if id.is_some() {
return Err(de::Error::duplicate_field("id"));
}
id = Some(map.next_value()?);
},
Field::Url => {
if url.is_some() {
return Err(de::Error::duplicate_field("url"));
}
url = map.next_value()?;
},
Field::ExternalUrl => {
if external_url.is_some() {
return Err(de::Error::duplicate_field("external_url"));
}
external_url = map.next_value()?;
},
Field::Title => {
if title.is_some() {
return Err(de::Error::duplicate_field("title"));
}
title = map.next_value()?;
},
Field::ContentHtml => {
if content_html.is_some() {
return Err(de::Error::duplicate_field("content_html"));
}
content_html = map.next_value()?;
},
Field::ContentText => {
if content_text.is_some() {
return Err(de::Error::duplicate_field("content_text"));
}
content_text = map.next_value()?;
},
Field::Summary => {
if summary.is_some() {
return Err(de::Error::duplicate_field("summary"));
}
summary = map.next_value()?;
},
Field::Image => {
if image.is_some() {
return Err(de::Error::duplicate_field("image"));
}
image = map.next_value()?;
},
Field::BannerImage => {
if banner_image.is_some() {
return Err(de::Error::duplicate_field("banner_image"));
}
banner_image = map.next_value()?;
},
Field::DatePublished => {
if date_published.is_some() {
return Err(de::Error::duplicate_field("date_published"));
}
date_published = map.next_value()?;
},
Field::DateModified => {
if date_modified.is_some() {
return Err(de::Error::duplicate_field("date_modified"));
}
date_modified = map.next_value()?;
},
Field::Author => {
if author.is_some() {
return Err(de::Error::duplicate_field("author"));
}
author = map.next_value()?;
},
Field::Tags => {
if tags.is_some() {
return Err(de::Error::duplicate_field("tags"));
}
tags = map.next_value()?;
},
Field::Attachments => {
if attachments.is_some() {
return Err(de::Error::duplicate_field("attachments"));
}
attachments = map.next_value()?;
},
}
}
let id = id.ok_or_else(|| de::Error::missing_field("id"))?;
let content = match (content_html, content_text) {
(Some(s), Some(t)) => {
Content::Both(s.to_string(), t.to_string())
},
(Some(s), _) => {
Content::Html(s.to_string())
},
(_, Some(t)) => {
Content::Text(t.to_string())
},
_ => return Err(de::Error::missing_field("content_html or content_text")),
};
Ok(Item {
id,
url,
external_url,
title,
content,
summary,
image,
banner_image,
date_published,
date_modified,
author,
tags,
attachments,
})
}
}
const FIELDS: &'static [&'static str] = &[
"id",
"url",
"external_url",
"title",
"content",
"summary",
"image",
"banner_image",
"date_published",
"date_modified",
"author",
"tags",
"attachments",
];
deserializer.deserialize_struct("Item", FIELDS, ItemVisitor)
}
}
#[cfg(test)]
mod tests {
use super::*;
use feed::Author;
use serde_json;
#[test]
#[allow(non_snake_case)]
fn serialize_item__content_html() {
let item = Item {
id: "1".into(),
url: Some("http://example.com/feed.json".into()),
external_url: Some("http://example.com/feed.json".into()),
title: Some("feed title".into()),
content: Content::Html("<p>content</p>".into()),
summary: Some("feed summary".into()),
image: Some("http://img.com/blah".into()),
banner_image: Some("http://img.com/blah".into()),
date_published: Some("2017-01-01 10:00:00".into()),
date_modified: Some("2017-01-01 10:00:00".into()),
author: Some(Author::new().name("bob jones").url("http://example.com").avatar("http://img.com/blah")),
tags: Some(vec!["json".into(), "feed".into()]),
attachments: Some(vec![]),
};
assert_eq!(
serde_json::to_string(&item).unwrap(),
r#"{"id":"1","url":"http://example.com/feed.json","external_url":"http://example.com/feed.json","title":"feed title","content_html":"<p>content</p>","content_text":null,"summary":"feed summary","image":"http://img.com/blah","banner_image":"http://img.com/blah","date_published":"2017-01-01 10:00:00","date_modified":"2017-01-01 10:00:00","author":{"name":"bob jones","url":"http://example.com","avatar":"http://img.com/blah"},"tags":["json","feed"],"attachments":[]}"#
);
}
#[test]
#[allow(non_snake_case)]
fn serialize_item__content_text() {
let item = Item {
id: "1".into(),
url: Some("http://example.com/feed.json".into()),
external_url: Some("http://example.com/feed.json".into()),
title: Some("feed title".into()),
content: Content::Text("content".into()),
summary: Some("feed summary".into()),
image: Some("http://img.com/blah".into()),
banner_image: Some("http://img.com/blah".into()),
date_published: Some("2017-01-01 10:00:00".into()),
date_modified: Some("2017-01-01 10:00:00".into()),
author: Some(Author::new().name("bob jones").url("http://example.com").avatar("http://img.com/blah")),
tags: Some(vec!["json".into(), "feed".into()]),
attachments: Some(vec![]),
};
assert_eq!(
serde_json::to_string(&item).unwrap(),
r#"{"id":"1","url":"http://example.com/feed.json","external_url":"http://example.com/feed.json","title":"feed title","content_html":null,"content_text":"content","summary":"feed summary","image":"http://img.com/blah","banner_image":"http://img.com/blah","date_published":"2017-01-01 10:00:00","date_modified":"2017-01-01 10:00:00","author":{"name":"bob jones","url":"http://example.com","avatar":"http://img.com/blah"},"tags":["json","feed"],"attachments":[]}"#
);
}
#[test]
#[allow(non_snake_case)]
fn serialize_item__content_both() {
let item = Item {
id: "1".into(),
url: Some("http://example.com/feed.json".into()),
external_url: Some("http://example.com/feed.json".into()),
title: Some("feed title".into()),
content: Content::Both("<p>content</p>".into(), "content".into()),
summary: Some("feed summary".into()),
image: Some("http://img.com/blah".into()),
banner_image: Some("http://img.com/blah".into()),
date_published: Some("2017-01-01 10:00:00".into()),
date_modified: Some("2017-01-01 10:00:00".into()),
author: Some(Author::new().name("bob jones").url("http://example.com").avatar("http://img.com/blah")),
tags: Some(vec!["json".into(), "feed".into()]),
attachments: Some(vec![]),
};
assert_eq!(
serde_json::to_string(&item).unwrap(),
r#"{"id":"1","url":"http://example.com/feed.json","external_url":"http://example.com/feed.json","title":"feed title","content_html":"<p>content</p>","content_text":"content","summary":"feed summary","image":"http://img.com/blah","banner_image":"http://img.com/blah","date_published":"2017-01-01 10:00:00","date_modified":"2017-01-01 10:00:00","author":{"name":"bob jones","url":"http://example.com","avatar":"http://img.com/blah"},"tags":["json","feed"],"attachments":[]}"#
);
}
#[test]
#[allow(non_snake_case)]
fn deserialize_item__content_html() {
let json = r#"{"id":"1","url":"http://example.com/feed.json","external_url":"http://example.com/feed.json","title":"feed title","content_html":"<p>content</p>","content_text":null,"summary":"feed summary","image":"http://img.com/blah","banner_image":"http://img.com/blah","date_published":"2017-01-01 10:00:00","date_modified":"2017-01-01 10:00:00","author":{"name":"bob jones","url":"http://example.com","avatar":"http://img.com/blah"},"tags":["json","feed"],"attachments":[]}"#;
let item: Item = serde_json::from_str(&json).unwrap();
let expected = Item {
id: "1".into(),
url: Some("http://example.com/feed.json".into()),
external_url: Some("http://example.com/feed.json".into()),
title: Some("feed title".into()),
content: Content::Html("<p>content</p>".into()),
summary: Some("feed summary".into()),
image: Some("http://img.com/blah".into()),
banner_image: Some("http://img.com/blah".into()),
date_published: Some("2017-01-01 10:00:00".into()),
date_modified: Some("2017-01-01 10:00:00".into()),
author: Some(Author::new().name("bob jones").url("http://example.com").avatar("http://img.com/blah")),
tags: Some(vec!["json".into(), "feed".into()]),
attachments: Some(vec![]),
};
assert_eq!(item, expected);
}
#[test]
#[allow(non_snake_case)]
fn deserialize_item__content_text() {
let json = r#"{"id":"1","url":"http://example.com/feed.json","external_url":"http://example.com/feed.json","title":"feed title","content_html":null,"content_text":"content","summary":"feed summary","image":"http://img.com/blah","banner_image":"http://img.com/blah","date_published":"2017-01-01 10:00:00","date_modified":"2017-01-01 10:00:00","author":{"name":"bob jones","url":"http://example.com","avatar":"http://img.com/blah"},"tags":["json","feed"],"attachments":[]}"#;
let item: Item = serde_json::from_str(&json).unwrap();
let expected = Item {
id: "1".into(),
url: Some("http://example.com/feed.json".into()),
external_url: Some("http://example.com/feed.json".into()),
title: Some("feed title".into()),
content: Content::Text("content".into()),
summary: Some("feed summary".into()),
image: Some("http://img.com/blah".into()),
banner_image: Some("http://img.com/blah".into()),
date_published: Some("2017-01-01 10:00:00".into()),
date_modified: Some("2017-01-01 10:00:00".into()),
author: Some(Author::new().name("bob jones").url("http://example.com").avatar("http://img.com/blah")),
tags: Some(vec!["json".into(), "feed".into()]),
attachments: Some(vec![]),
};
assert_eq!(item, expected);
}
#[test]
#[allow(non_snake_case)]
fn deserialize_item__content_both() {
let json = r#"{"id":"1","url":"http://example.com/feed.json","external_url":"http://example.com/feed.json","title":"feed title","content_html":"<p>content</p>","content_text":"content","summary":"feed summary","image":"http://img.com/blah","banner_image":"http://img.com/blah","date_published":"2017-01-01 10:00:00","date_modified":"2017-01-01 10:00:00","author":{"name":"bob jones","url":"http://example.com","avatar":"http://img.com/blah"},"tags":["json","feed"],"attachments":[]}"#;
let item: Item = serde_json::from_str(&json).unwrap();
let expected = Item {
id: "1".into(),
url: Some("http://example.com/feed.json".into()),
external_url: Some("http://example.com/feed.json".into()),
title: Some("feed title".into()),
content: Content::Both("<p>content</p>".into(), "content".into()),
summary: Some("feed summary".into()),
image: Some("http://img.com/blah".into()),
banner_image: Some("http://img.com/blah".into()),
date_published: Some("2017-01-01 10:00:00".into()),
date_modified: Some("2017-01-01 10:00:00".into()),
author: Some(Author::new().name("bob jones").url("http://example.com").avatar("http://img.com/blah")),
tags: Some(vec!["json".into(), "feed".into()]),
attachments: Some(vec![]),
};
assert_eq!(item, expected);
}
}

252
lib/jsonfeed/src/lib.rs Normal file

@ -0,0 +1,252 @@
//! JSON Feed is a syndication format similar to Atom and RSS, using JSON
//! instead of XML.
//!
//! This crate can serialize and deserialize between JSON Feed strings
//! and Rust data structures. It also allows for programmatically building
//! a JSON Feed.
//!
//! Example:
//!
//! ```rust
//! extern crate jsonfeed;
//!
//! use jsonfeed::{Feed, Item};
//!
//! fn run() -> Result<(), jsonfeed::Error> {
//! let j = r#"{
//! "title": "my feed",
//! "version": "https://jsonfeed.org/version/1",
//! "items": []
//! }"#;
//! let feed = jsonfeed::from_str(j).unwrap();
//!
//! let new_feed = Feed::builder()
//! .title("some other feed")
//! .item(Item::builder()
//! .title("some item title")
//! .content_html("<p>Hello, World</p>")
//! .build()?)
//! .item(Item::builder()
//! .title("some other item title")
//! .content_text("Hello, World!")
//! .build()?)
//! .build();
//! println!("{}", jsonfeed::to_string(&new_feed).unwrap());
//! Ok(())
//! }
//! fn main() {
//! let _ = run();
//! }
//! ```
extern crate serde;
#[macro_use] extern crate error_chain;
#[macro_use] extern crate serde_derive;
extern crate serde_json;
mod errors;
mod item;
mod feed;
mod builder;
pub use errors::*;
pub use item::*;
pub use feed::{Feed, Author, Attachment};
use std::io::Write;
/// Attempts to convert a string slice to a Feed object
///
/// Example
///
/// ```rust
/// # extern crate jsonfeed;
/// # use jsonfeed::Feed;
/// # use std::default::Default;
/// # fn main() {
/// let json = r#"{"version": "https://jsonfeed.org/version/1", "title": "", "items": []}"#;
/// let feed: Feed = jsonfeed::from_str(&json).unwrap();
///
/// assert_eq!(feed, Feed::default());
/// # }
/// ```
pub fn from_str(s: &str) -> Result<Feed> {
Ok(serde_json::from_str(s)?)
}
/// Deserialize a Feed object from an IO stream of JSON
pub fn from_reader<R: ::std::io::Read>(r: R) -> Result<Feed> {
Ok(serde_json::from_reader(r)?)
}
/// Deserialize a Feed object from bytes of JSON text
pub fn from_slice<'a>(v: &'a [u8]) -> Result<Feed> {
Ok(serde_json::from_slice(v)?)
}
/// Convert a serde_json::Value type to a Feed object
pub fn from_value(value: serde_json::Value) -> Result<Feed> {
Ok(serde_json::from_value(value)?)
}
/// Serialize a Feed to a JSON Feed string
pub fn to_string(value: &Feed) -> Result<String> {
Ok(serde_json::to_string(value)?)
}
/// Pretty-print a Feed to a JSON Feed string
pub fn to_string_pretty(value: &Feed) -> Result<String> {
Ok(serde_json::to_string_pretty(value)?)
}
/// Convert a Feed to a serde_json::Value
pub fn to_value(value: Feed) -> Result<serde_json::Value> {
Ok(serde_json::to_value(value)?)
}
/// Convert a Feed to a vector of bytes of JSON
pub fn to_vec(value: &Feed) -> Result<Vec<u8>> {
Ok(serde_json::to_vec(value)?)
}
/// Convert a Feed to a vector of bytes of pretty-printed JSON
pub fn to_vec_pretty(value: &Feed) -> Result<Vec<u8>> {
Ok(serde_json::to_vec_pretty(value)?)
}
/// Serialize a Feed to JSON and output to an IO stream
pub fn to_writer<W>(writer: W, value: &Feed) -> Result<()>
where W: Write
{
Ok(serde_json::to_writer(writer, value)?)
}
/// Serialize a Feed to pretty-printed JSON and output to an IO stream
pub fn to_writer_pretty<W>(writer: W, value: &Feed) -> Result<()>
where W: Write
{
Ok(serde_json::to_writer_pretty(writer, value)?)
}
#[cfg(test)]
mod tests {
use super::*;
use std::io::Cursor;
#[test]
fn from_str() {
let feed = r#"{"version": "https://jsonfeed.org/version/1","title":"","items":[]}"#;
let expected = Feed::default();
assert_eq!(
super::from_str(&feed).unwrap(),
expected
);
}
#[test]
fn from_reader() {
let feed = r#"{"version": "https://jsonfeed.org/version/1","title":"","items":[]}"#;
let feed = feed.as_bytes();
let feed = Cursor::new(feed);
let expected = Feed::default();
assert_eq!(
super::from_reader(feed).unwrap(),
expected
);
}
#[test]
fn from_slice() {
let feed = r#"{"version": "https://jsonfeed.org/version/1","title":"","items":[]}"#;
let feed = feed.as_bytes();
let expected = Feed::default();
assert_eq!(
super::from_slice(&feed).unwrap(),
expected
);
}
#[test]
fn from_value() {
let feed = r#"{"version": "https://jsonfeed.org/version/1","title":"","items":[]}"#;
let feed: serde_json::Value = serde_json::from_str(&feed).unwrap();
let expected = Feed::default();
assert_eq!(
super::from_value(feed).unwrap(),
expected
);
}
#[test]
fn to_string() {
let feed = Feed::default();
let expected = r#"{"version":"https://jsonfeed.org/version/1","title":"","items":[]}"#;
assert_eq!(
super::to_string(&feed).unwrap(),
expected
);
}
#[test]
fn to_string_pretty() {
let feed = Feed::default();
let expected = r#"{
"version": "https://jsonfeed.org/version/1",
"title": "",
"items": []
}"#;
assert_eq!(
super::to_string_pretty(&feed).unwrap(),
expected
);
}
#[test]
fn to_value() {
let feed = r#"{"version":"https://jsonfeed.org/version/1","title":"","items":[]}"#;
let expected: serde_json::Value = serde_json::from_str(&feed).unwrap();
assert_eq!(
super::to_value(Feed::default()).unwrap(),
expected
);
}
#[test]
fn to_vec() {
let feed = r#"{"version":"https://jsonfeed.org/version/1","title":"","items":[]}"#;
let expected = feed.as_bytes();
assert_eq!(
super::to_vec(&Feed::default()).unwrap(),
expected
);
}
#[test]
fn to_vec_pretty() {
let feed = r#"{
"version": "https://jsonfeed.org/version/1",
"title": "",
"items": []
}"#;
let expected = feed.as_bytes();
assert_eq!(
super::to_vec_pretty(&Feed::default()).unwrap(),
expected
);
}
#[test]
fn to_writer() {
let feed = r#"{"version":"https://jsonfeed.org/version/1","title":"","items":[]}"#;
let feed = feed.as_bytes();
let mut writer = Cursor::new(Vec::with_capacity(feed.len()));
super::to_writer(&mut writer, &Feed::default()).expect("Could not write to writer");
let result = writer.into_inner();
assert_eq!(result, feed);
}
#[test]
fn to_writer_pretty() {
let feed = r#"{
"version": "https://jsonfeed.org/version/1",
"title": "",
"items": []
}"#;
let feed = feed.as_bytes();
let mut writer = Cursor::new(Vec::with_capacity(feed.len()));
super::to_writer_pretty(&mut writer, &Feed::default()).expect("Could not write to writer");
let result = writer.into_inner();
assert_eq!(result, feed);
}
}

1
lib/patreon/.gitignore vendored Normal file

@ -0,0 +1 @@
.env

20
lib/patreon/Cargo.toml Normal file

@ -0,0 +1,20 @@
[package]
name = "patreon"
version = "0.1.0"
authors = ["Christine Dodrill <me@christine.website>"]
edition = "2018"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
chrono = { version = "0.4", features = ["serde"] }
reqwest = { version = "0.10", features = ["json"] }
serde_json = "1.0"
serde = { version = "1", features = ["derive"] }
thiserror = "1"
log = "0"
[dev-dependencies]
tokio = { version = "0.2", features = ["macros"] }
envy = "0.4"
pretty_env_logger = "0"


@ -0,0 +1,17 @@
use patreon::*;
#[tokio::main]
async fn main() -> Result<()> {
pretty_env_logger::init();
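// Credentials come from PATREON_-prefixed environment variables:
// PATREON_CLIENT_ID, PATREON_CLIENT_SECRET, PATREON_ACCESS_TOKEN and
// PATREON_REFRESH_TOKEN.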
let creds: Credentials = envy::prefixed("PATREON_").from_env().unwrap();
let cli = Client::new(creds);
let camp = cli.campaign().await?;
println!("{:#?}", camp);
let id = camp.data[0].id.clone();
let pledges = cli.pledges(id).await?;
println!("{:#?}", pledges);
Ok(())
}

158
lib/patreon/src/lib.rs Normal file

@ -0,0 +1,158 @@
#[macro_use]
extern crate log;
use serde::{Deserialize, Serialize};
use thiserror::Error;
use chrono::prelude::*;
pub type Campaigns = Vec<Object<Campaign>>;
pub type Pledges = Vec<Object<Pledge>>;
pub type Users = Vec<Object<User>>;
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Campaign {
pub summary: String,
pub creation_name: String,
pub display_patron_goals: bool,
pub pay_per_name: String,
pub one_liner: Option<String>,
pub main_video_embed: Option<String>,
pub main_video_url: Option<String>,
pub image_small_url: String,
pub image_url: String,
pub thanks_video_url: Option<String>,
pub thanks_embed: Option<String>,
pub thanks_msg: String,
pub is_charged_immediately: bool,
pub is_monthly: bool,
pub is_nsfw: bool,
pub is_plural: bool,
pub created_at: DateTime<Utc>,
pub published_at: DateTime<Utc>,
pub pledge_url: String,
pub pledge_sum: i32,
pub patron_count: u32,
pub creation_count: u32,
pub outstanding_payment_amount_cents: u64,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Pledge {
pub amount_cents: u32,
pub created_at: String,
pub declined_since: Option<String>,
pub pledge_cap_cents: u32,
pub patron_pays_fees: bool,
pub total_historical_amount_cents: Option<u32>,
pub is_paused: Option<bool>,
pub has_shipping_address: Option<bool>,
pub outstanding_payment_amount_cents: Option<u32>,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct User {
pub first_name: String,
pub last_name: String,
pub full_name: String,
pub vanity: Option<String>,
pub about: Option<String>,
pub gender: i32,
pub image_url: String,
pub thumb_url: String,
pub created: DateTime<Utc>,
pub url: String,
}
pub type Result<T> = std::result::Result<T, Error>;
#[derive(Error, Debug)]
pub enum Error {
#[error("json error: {0:?}")]
Json(#[from] serde_json::Error),
#[error("request error: {0:?}")]
Request(#[from] reqwest::Error),
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Credentials {
pub client_id: String,
pub client_secret: String,
pub access_token: String,
pub refresh_token: String,
}
pub struct Client {
cli: reqwest::Client,
base_url: String,
creds: Credentials,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Data<T, U> {
pub data: T,
pub included: Option<Vec<U>>,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Object<T> {
pub id: String,
pub attributes: T,
pub r#type: String,
pub links: Option<Links>,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Links {
related: String,
}
impl Client {
pub fn new(creds: Credentials) -> Self {
Self {
cli: reqwest::Client::new(),
base_url: "https://api.patreon.com".into(),
creds,
}
}
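/// Fetches the campaigns owned by the authenticated creator from
/// `/oauth2/api/current_user/campaigns`.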
pub async fn campaign(&self) -> Result<Data<Vec<Object<Campaign>>, ()>> {
let data = self
.cli
.get(&format!(
"{}/oauth2/api/current_user/campaigns",
self.base_url
))
.query(&[("include", "patron.null"), ("includes", "")])
.header(
"Authorization",
format!("Bearer {}", self.creds.access_token),
)
.send()
.await?
.error_for_status()?.text().await?;
log::debug!("campaign response: {}", data);
Ok(serde_json::from_str(&data)?)
}
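/// Fetches the pledges for the given campaign ID and returns the included
/// patron `User` objects.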
pub async fn pledges(&self, camp_id: String) -> Result<Vec<Object<User>>> {
let data = self
.cli
.get(&format!(
"{}/oauth2/api/campaigns/{}/pledges",
self.base_url, camp_id
))
.query(&[("include", "patron.null")])
.header(
"Authorization",
format!("Bearer {}", self.creds.access_token),
)
.send()
.await?
.error_for_status()?
.text()
.await?;
log::debug!("pledges for {}: {}", camp_id, data);
let data : Data<Vec<Object<Pledge>>, Object<User>> = serde_json::from_str(&data)?;
Ok(data.included.unwrap())
}
}


@ -1,588 +0,0 @@
# file generated from go.mod using vgo2nix (https://github.com/adisbladis/vgo2nix)
[
{
goPackagePath = "cloud.google.com/go";
fetch = {
type = "git";
url = "https://code.googlesource.com/gocloud";
rev = "v0.34.0";
sha256 = "1kclgclwar3r37zbvb9gg3qxbgzkb50zk3s9778zlh2773qikmai";
};
}
{
goPackagePath = "github.com/alecthomas/template";
fetch = {
type = "git";
url = "https://github.com/alecthomas/template";
rev = "fb15b899a751";
sha256 = "1vlasv4dgycydh5wx6jdcvz40zdv90zz1h7836z7lhsi2ymvii26";
};
}
{
goPackagePath = "github.com/alecthomas/units";
fetch = {
type = "git";
url = "https://github.com/alecthomas/units";
rev = "c3de453c63f4";
sha256 = "0js37zlgv37y61j4a2d46jh72xm5kxmpaiw0ya9v944bjpc386my";
};
}
{
goPackagePath = "github.com/beorn7/perks";
fetch = {
type = "git";
url = "https://github.com/beorn7/perks";
rev = "v1.0.1";
sha256 = "17n4yygjxa6p499dj3yaqzfww2g7528165cl13haj97hlx94dgl7";
};
}
{
goPackagePath = "github.com/celrenheit/sandflake";
fetch = {
type = "git";
url = "https://github.com/celrenheit/sandflake";
rev = "50a943690bc2";
sha256 = "0ji76y79xqlx60bfxcik8zy5ha4gzhhi9qw020dkwqhbnbpaj6w2";
};
}
{
goPackagePath = "github.com/cespare/xxhash";
fetch = {
type = "git";
url = "https://github.com/cespare/xxhash";
rev = "v2.1.1";
sha256 = "0rl5rs8546zj1vzggv38w93wx0b5dvav7yy5hzxa8kw7iikv1cgr";
};
}
{
goPackagePath = "github.com/davecgh/go-spew";
fetch = {
type = "git";
url = "https://github.com/davecgh/go-spew";
rev = "v1.1.1";
sha256 = "0hka6hmyvp701adzag2g26cxdj47g21x6jz4sc6jjz1mn59d474y";
};
}
{
goPackagePath = "github.com/fsnotify/fsnotify";
fetch = {
type = "git";
url = "https://github.com/fsnotify/fsnotify";
rev = "v1.4.7";
sha256 = "07va9crci0ijlivbb7q57d2rz9h27zgn2fsm60spjsqpdbvyrx4g";
};
}
{
goPackagePath = "github.com/go-kit/kit";
fetch = {
type = "git";
url = "https://github.com/go-kit/kit";
rev = "v0.9.0";
sha256 = "09038mnw705h7isbjp8dzgp2i04bp5rqkmifxvwc5xkh75s00qpw";
};
}
{
goPackagePath = "github.com/go-logfmt/logfmt";
fetch = {
type = "git";
url = "https://github.com/go-logfmt/logfmt";
rev = "v0.4.0";
sha256 = "06smxc112xmixz78nyvk3b2hmc7wasf2sl5vxj1xz62kqcq9lzm9";
};
}
{
goPackagePath = "github.com/go-stack/stack";
fetch = {
type = "git";
url = "https://github.com/go-stack/stack";
rev = "v1.8.0";
sha256 = "0wk25751ryyvxclyp8jdk5c3ar0cmfr8lrjb66qbg4808x66b96v";
};
}
{
goPackagePath = "github.com/gogo/protobuf";
fetch = {
type = "git";
url = "https://github.com/gogo/protobuf";
rev = "v1.1.1";
sha256 = "1525pq7r6h3s8dncvq8gxi893p2nq8dxpzvq0nfl5b4p6mq0v1c2";
};
}
{
goPackagePath = "github.com/golang/protobuf";
fetch = {
type = "git";
url = "https://github.com/golang/protobuf";
rev = "v1.4.0";
sha256 = "1fjvl5n77abxz5qsd4mgyvjq19x43c5bfvmq62mq3m5plx6zksc8";
};
}
{
goPackagePath = "github.com/google/go-cmp";
fetch = {
type = "git";
url = "https://github.com/google/go-cmp";
rev = "v0.4.0";
sha256 = "1x5pvl3fb5sbyng7i34431xycnhmx8xx94gq2n19g6p0vz68z2v2";
};
}
{
goPackagePath = "github.com/google/gofuzz";
fetch = {
type = "git";
url = "https://github.com/google/gofuzz";
rev = "v1.0.0";
sha256 = "0qz439qvccm91w0mmjz4fqgx48clxdwagkvvx89cr43q1d4iry36";
};
}
{
goPackagePath = "github.com/gorilla/feeds";
fetch = {
type = "git";
url = "https://github.com/gorilla/feeds";
rev = "v1.1.1";
sha256 = "1lwqibra4hyzx0jhaz12rfhfnw73bmdf8cn9r51nqidk8k7zf7sg";
};
}
{
goPackagePath = "github.com/hpcloud/tail";
fetch = {
type = "git";
url = "https://github.com/hpcloud/tail";
rev = "v1.0.0";
sha256 = "1njpzc0pi1acg5zx9y6vj9xi6ksbsc5d387rd6904hy6rh2m6kn0";
};
}
{
goPackagePath = "github.com/joho/godotenv";
fetch = {
type = "git";
url = "https://github.com/joho/godotenv";
rev = "v1.3.0";
sha256 = "0ri8if0pc3x6jg4c3i8wr58xyfpxkwmcjk3rp8gb398a1aa3gpjm";
};
}
{
goPackagePath = "github.com/json-iterator/go";
fetch = {
type = "git";
url = "https://github.com/json-iterator/go";
rev = "v1.1.9";
sha256 = "0pkn2maymgl9v6vmq9q1si8xr5bbl88n6981y0lx09px6qxb29qx";
};
}
{
goPackagePath = "github.com/julienschmidt/httprouter";
fetch = {
type = "git";
url = "https://github.com/julienschmidt/httprouter";
rev = "v1.2.0";
sha256 = "1k8bylc9s4vpvf5xhqh9h246dl1snxrzzz0614zz88cdh8yzs666";
};
}
{
goPackagePath = "github.com/konsorten/go-windows-terminal-sequences";
fetch = {
type = "git";
url = "https://github.com/konsorten/go-windows-terminal-sequences";
rev = "v1.0.1";
sha256 = "1lchgf27n276vma6iyxa0v1xds68n2g8lih5lavqnx5x6q5pw2ip";
};
}
{
goPackagePath = "github.com/kr/logfmt";
fetch = {
type = "git";
url = "https://github.com/kr/logfmt";
rev = "b84e30acd515";
sha256 = "02ldzxgznrfdzvghfraslhgp19la1fczcbzh7wm2zdc6lmpd1qq9";
};
}
{
goPackagePath = "github.com/kr/pretty";
fetch = {
type = "git";
url = "https://github.com/kr/pretty";
rev = "v0.1.0";
sha256 = "18m4pwg2abd0j9cn5v3k2ksk9ig4vlwxmlw9rrglanziv9l967qp";
};
}
{
goPackagePath = "github.com/kr/pty";
fetch = {
type = "git";
url = "https://github.com/kr/pty";
rev = "v1.1.1";
sha256 = "0383f0mb9kqjvncqrfpidsf8y6ns5zlrc91c6a74xpyxjwvzl2y6";
};
}
{
goPackagePath = "github.com/kr/text";
fetch = {
type = "git";
url = "https://github.com/kr/text";
rev = "v0.1.0";
sha256 = "1gm5bsl01apvc84bw06hasawyqm4q84vx1pm32wr9jnd7a8vjgj1";
};
}
{
goPackagePath = "github.com/leanovate/gopter";
fetch = {
type = "git";
url = "https://github.com/leanovate/gopter";
rev = "634a59d12406";
sha256 = "0rjx9niww7qxiqch6lwq9gibvxi41nm112yg5mzl3hpi084mb94c";
};
}
{
goPackagePath = "github.com/matttproud/golang_protobuf_extensions";
fetch = {
type = "git";
url = "https://github.com/matttproud/golang_protobuf_extensions";
rev = "v1.0.1";
sha256 = "1d0c1isd2lk9pnfq2nk0aih356j30k3h1gi2w0ixsivi5csl7jya";
};
}
{
goPackagePath = "github.com/modern-go/concurrent";
fetch = {
type = "git";
url = "https://github.com/modern-go/concurrent";
rev = "bacd9c7ef1dd";
sha256 = "0s0fxccsyb8icjmiym5k7prcqx36hvgdwl588y0491gi18k5i4zs";
};
}
{
goPackagePath = "github.com/modern-go/reflect2";
fetch = {
type = "git";
url = "https://github.com/modern-go/reflect2";
rev = "v1.0.1";
sha256 = "06a3sablw53n1dqqbr2f53jyksbxdmmk8axaas4yvnhyfi55k4lf";
};
}
{
goPackagePath = "github.com/mwitkow/go-conntrack";
fetch = {
type = "git";
url = "https://github.com/mwitkow/go-conntrack";
rev = "cc309e4a2223";
sha256 = "0nbrnpk7bkmqg9mzwsxlm0y8m7s9qd9phr1q30qlx2qmdmz7c1mf";
};
}
{
goPackagePath = "github.com/mxpv/patreon-go";
fetch = {
type = "git";
url = "https://github.com/mxpv/patreon-go";
rev = "646111f1d983";
sha256 = "0cksf3andl8z04lychay2j0l8wrpdq7j5pdb6zy5yr4990iab6aa";
};
}
{
goPackagePath = "github.com/onsi/ginkgo";
fetch = {
type = "git";
url = "https://github.com/onsi/ginkgo";
rev = "v1.7.0";
sha256 = "14wgpdrvpc35rdz3859bz53sc1g4vpr1fysy15wy3ff9gmqs14yg";
};
}
{
goPackagePath = "github.com/onsi/gomega";
fetch = {
type = "git";
url = "https://github.com/onsi/gomega";
rev = "v1.4.3";
sha256 = "1c8rqg5i2hz3snmq7s41yar1zjnzilb0fyiyhkg83v97afcfx79v";
};
}
{
goPackagePath = "github.com/philandstuff/dhall-golang";
fetch = {
type = "git";
url = "https://github.com/philandstuff/dhall-golang";
rev = "v1.0.0";
sha256 = "1ir3yhjbkqgk1z1q2v6vgbrw4q1n086mi9mbxpjrn2yn09k1h8l1";
};
}
{
goPackagePath = "github.com/pkg/errors";
fetch = {
type = "git";
url = "https://github.com/pkg/errors";
rev = "v0.8.1";
sha256 = "0g5qcb4d4fd96midz0zdk8b9kz8xkzwfa8kr1cliqbg8sxsy5vd1";
};
}
{
goPackagePath = "github.com/pmezard/go-difflib";
fetch = {
type = "git";
url = "https://github.com/pmezard/go-difflib";
rev = "v1.0.0";
sha256 = "0c1cn55m4rypmscgf0rrb88pn58j3ysvc2d0432dp3c6fqg6cnzw";
};
}
{
goPackagePath = "github.com/povilasv/prommod";
fetch = {
type = "git";
url = "https://github.com/povilasv/prommod";
rev = "v0.0.12";
sha256 = "1fcmlrx0hyvwxk67p01avaz3myis3jyamhfwmyx4crgyhdc6pbb7";
};
}
{
goPackagePath = "github.com/prometheus/client_golang";
fetch = {
type = "git";
url = "https://github.com/prometheus/client_golang";
rev = "v1.6.0";
sha256 = "0wwkx69in9dy5kzd3z6rrqf5by8cwl9r7r17fswcpx9rl3g61x1l";
};
}
{
goPackagePath = "github.com/prometheus/client_model";
fetch = {
type = "git";
url = "https://github.com/prometheus/client_model";
rev = "v0.2.0";
sha256 = "0jffnz94d6ff39fr96b5w8i8yk26pwnrfggzz8jhi8k0yihg2c9d";
};
}
{
goPackagePath = "github.com/prometheus/common";
fetch = {
type = "git";
url = "https://github.com/prometheus/common";
rev = "v0.9.1";
sha256 = "12pyywb02p7d30ccm41mwn69qsgqnsgv1w9jlqrrln2f1svnbqch";
};
}
{
goPackagePath = "github.com/prometheus/procfs";
fetch = {
type = "git";
url = "https://github.com/prometheus/procfs";
rev = "v0.0.11";
sha256 = "1msc8bfywsmrgr2ryqjdqwkxiz1ll08r3qgvaka2507z1wpcpj2c";
};
}
{
goPackagePath = "github.com/russross/blackfriday";
fetch = {
type = "git";
url = "https://github.com/russross/blackfriday";
rev = "v2.0.0";
sha256 = "10xh4zak0qbdi15nik2y72c7nn0k6vsc1iawkwx5v38cwp6hzszl";
};
}
{
goPackagePath = "github.com/sebest/xff";
fetch = {
type = "git";
url = "https://github.com/sebest/xff";
rev = "6c115e0ffa35";
sha256 = "0l11d8mc870vxzgi74cc9dqr7kgxjmbfkfi53gc30rsyx877jx4h";
};
}
{
goPackagePath = "github.com/shurcooL/sanitized_anchor_name";
fetch = {
type = "git";
url = "https://github.com/shurcooL/sanitized_anchor_name";
rev = "v1.0.0";
sha256 = "1gv9p2nr46z80dnfjsklc6zxbgk96349sdsxjz05f3z6wb6m5l8f";
};
}
{
goPackagePath = "github.com/sirupsen/logrus";
fetch = {
type = "git";
url = "https://github.com/sirupsen/logrus";
rev = "v1.4.2";
sha256 = "087k2lxrr9p9dh68yw71d05h5g9p5v26zbwd6j7lghinjfaw334x";
};
}
{
goPackagePath = "github.com/snabb/diagio";
fetch = {
type = "git";
url = "https://github.com/snabb/diagio";
rev = "v1.0.0";
sha256 = "0g4swgx30gaq0a0l71qd7c1q3dq6q8xcdnwp8063lrv8vqf3xplg";
};
}
{
goPackagePath = "github.com/snabb/sitemap";
fetch = {
type = "git";
url = "https://github.com/snabb/sitemap";
rev = "v1.0.0";
sha256 = "0mb8r4r7dqqwdi3f9brcsqp469rsn621x9h2ahc601arjiv1zk0c";
};
}
{
goPackagePath = "github.com/stretchr/objx";
fetch = {
type = "git";
url = "https://github.com/stretchr/objx";
rev = "v0.1.1";
sha256 = "0iph0qmpyqg4kwv8jsx6a56a7hhqq8swrazv40ycxk9rzr0s8yls";
};
}
{
goPackagePath = "github.com/stretchr/testify";
fetch = {
type = "git";
url = "https://github.com/stretchr/testify";
rev = "v1.5.1";
sha256 = "09r89m1wy4cjv2nps1ykp00qjpi0531r07q3s34hr7m6njk4srkl";
};
}
{
goPackagePath = "github.com/ugorji/go";
fetch = {
type = "git";
url = "https://github.com/ugorji/go";
rev = "a2c9fa250719";
sha256 = "10l24bp2vj5c99lxlkzm9icja265jmpki813v3s32ibam590virx";
};
}
{
goPackagePath = "golang.org/x/crypto";
fetch = {
type = "git";
url = "https://go.googlesource.com/crypto";
rev = "c2843e01d9a2";
sha256 = "01xgxbj5r79nmisdvpq48zfy8pzaaj90bn6ngd4nf33j9ar1dp8r";
};
}
{
goPackagePath = "golang.org/x/net";
fetch = {
type = "git";
url = "https://go.googlesource.com/net";
rev = "d28f0bde5980";
sha256 = "18xj31h70m7xxb7gc86n9i21w6d7djbjz67zfaljm4jqskz6hxkf";
};
}
{
goPackagePath = "golang.org/x/oauth2";
fetch = {
type = "git";
url = "https://go.googlesource.com/oauth2";
rev = "bf48bf16ab8d";
sha256 = "1sirdib60zwmh93kf9qrx51r8544k1p9rs5mk0797wibz3m4mrdg";
};
}
{
goPackagePath = "golang.org/x/sync";
fetch = {
type = "git";
url = "https://go.googlesource.com/sync";
rev = "cd5d95a43a6e";
sha256 = "1nqkyz2y1qvqcma52ijh02s8aiqmkfb95j08f6zcjhbga3ds6hds";
};
}
{
goPackagePath = "golang.org/x/sys";
fetch = {
type = "git";
url = "https://go.googlesource.com/sys";
rev = "1957bb5e6d1f";
sha256 = "0imqk4l9785rw7ddvywyf8zn7k3ga6f17ky8rmf8wrri7nknr03f";
};
}
{
goPackagePath = "golang.org/x/text";
fetch = {
type = "git";
url = "https://go.googlesource.com/text";
rev = "v0.3.0";
sha256 = "0r6x6zjzhr8ksqlpiwm5gdd7s209kwk5p4lw54xjvz10cs3qlq19";
};
}
{
goPackagePath = "golang.org/x/xerrors";
fetch = {
type = "git";
url = "https://go.googlesource.com/xerrors";
rev = "9bdfabe68543";
sha256 = "1yjfi1bk9xb81lqn85nnm13zz725wazvrx3b50hx19qmwg7a4b0c";
};
}
{
goPackagePath = "google.golang.org/appengine";
fetch = {
type = "git";
url = "https://github.com/golang/appengine";
rev = "v1.4.0";
sha256 = "06zl7w4sxgdq2pl94wy9ncii6h0z3szl4xpqds0sv3b3wbdlhbnn";
};
}
{
goPackagePath = "google.golang.org/protobuf";
fetch = {
type = "git";
url = "https://go.googlesource.com/protobuf";
rev = "v1.21.0";
sha256 = "12bwln8z1lf9105gdp6ip0rx741i4yfz1520gxnp8861lh9wcl63";
};
}
{
goPackagePath = "gopkg.in/alecthomas/kingpin.v2";
fetch = {
type = "git";
url = "https://gopkg.in/alecthomas/kingpin.v2";
rev = "v2.2.6";
sha256 = "0mndnv3hdngr3bxp7yxfd47cas4prv98sqw534mx7vp38gd88n5r";
};
}
{
goPackagePath = "gopkg.in/check.v1";
fetch = {
type = "git";
url = "https://gopkg.in/check.v1";
rev = "41f04d3bba15";
sha256 = "0vfk9czmlxmp6wndq8k17rhnjxal764mxfhrccza7nwlia760pjy";
};
}
{
goPackagePath = "gopkg.in/fsnotify.v1";
fetch = {
type = "git";
url = "https://gopkg.in/fsnotify.v1";
rev = "v1.4.7";
sha256 = "07va9crci0ijlivbb7q57d2rz9h27zgn2fsm60spjsqpdbvyrx4g";
};
}
{
goPackagePath = "gopkg.in/tomb.v1";
fetch = {
type = "git";
url = "https://gopkg.in/tomb.v1";
rev = "dd632973f1e7";
sha256 = "1lqmq1ag7s4b3gc3ddvr792c5xb5k6sfn0cchr3i2s7f1c231zjv";
};
}
{
goPackagePath = "gopkg.in/yaml.v2";
fetch = {
type = "git";
url = "https://gopkg.in/yaml.v2";
rev = "v2.2.8";
sha256 = "1inf7svydzscwv9fcjd2rm61a4xjk6jkswknybmns2n58shimapw";
};
}
{
goPackagePath = "within.website/ln";
fetch = {
type = "git";
url = "https://github.com/Xe/ln";
rev = "v0.9.0";
sha256 = "1djbjwkyqlvf5gy5jvx0z9mm3g56fg2jjmv0ghwzlvwwpx5h338l";
};
}
]


@ -11,6 +11,18 @@
"url": "https://github.com/justinwoo/easy-dhall-nix/archive/735ad924fd829c9bbee0a167e0b2bbbf91e2cad5.tar.gz",
"url_template": "https://github.com/<owner>/<repo>/archive/<rev>.tar.gz"
},
"naersk": {
"branch": "master",
"description": "Build rust crates in Nix. No configuration, no code generation, no IFD. Sandbox friendly.",
"homepage": "",
"owner": "nmattia",
"repo": "naersk",
"rev": "d5a23213d561893cebdf0d251502430334673036",
"sha256": "0ifvqv3vjg80hhgxr7b22i22gh2gxw0gm5iijd9r7y4qd7n2yrcp",
"type": "tarball",
"url": "https://github.com/nmattia/naersk/archive/d5a23213d561893cebdf0d251502430334673036.tar.gz",
"url_template": "https://github.com/<owner>/<repo>/archive/<rev>.tar.gz"
},
"niv": {
"branch": "master",
"description": "Easy dependency management for Nix projects",


@ -1,6 +0,0 @@
#!/bin/sh
set -e
docker build -t xena/site .
exec docker run --rm -itp 5030:5000 -e PORT=5000 xena/site


@ -1,4 +0,0 @@
#!/bin/sh
cd "$(dirname "$0")"/..
./bin/site

10
scripts/release.sh Executable file

@ -0,0 +1,10 @@
#!/usr/bin/env nix-shell
#! nix-shell -p doctl -p kubectl -p curl -i bash
# Install the Dhall-to-YAML tooling pinned in nix/dhall-yaml.nix.
nix-env -if ./nix/dhall-yaml.nix
# Fetch the kubeconfig for the kubermemes cluster.
doctl kubernetes cluster kubeconfig save kubermemes
# Render the deployment manifest from Dhall, apply it, then wait for the rollout.
dhall-to-yaml-ng < ./site.dhall | kubectl apply -n apps -f -
kubectl rollout status -n apps deployment/christinewebsite
# Run the one-off job in k8s/job.yml briefly, then clean it up.
kubectl apply -f ./k8s/job.yml
sleep 10
kubectl delete -f ./k8s/job.yml
# Ask mi.within.website to re-read the site's JSON Feed.
curl -H "Authorization: $MI_TOKEN" --data "https://christine.website/blog.json" https://mi.within.website/blog/refresh


@ -9,11 +9,16 @@ in with pkgs;
with xepkgs;
mkShell {
buildInputs = [
# Go tools
go
goimports
gopls
vgo2nix
# Rust
cargo
cargo-watch
rls
rustc
rustfmt
# system dependencies
openssl
pkg-config
# kubernetes deployment
dhall
@ -26,5 +31,8 @@ mkShell {
ispell
];
SITE_PREFIX = "devel.";
CLACK_SET = "Ashlynn,Terry Davis,Dennis Ritchie";
RUST_LOG = "info";
GITHUB_SHA = "devel";
}


@ -10,13 +10,32 @@ let image = "xena/christinewebsite:${tag}"
let vars
: List kubernetes.EnvVar.Type
= [ kubernetes.EnvVar::{ name = "PORT", value = Some "5000" } ]
= [ kubernetes.EnvVar::{ name = "PORT", value = Some "3030" }
, kubernetes.EnvVar::{ name = "RUST_LOG", value = Some "info" }
, kubernetes.EnvVar::{
, name = "PATREON_CLIENT_ID"
, value = Some env:PATREON_CLIENT_ID as Text
}
, kubernetes.EnvVar::{
, name = "PATREON_CLIENT_SECRET"
, value = Some env:PATREON_CLIENT_SECRET as Text
}
, kubernetes.EnvVar::{
, name = "PATREON_ACCESS_TOKEN"
, value = Some env:PATREON_ACCESS_TOKEN as Text
}
, kubernetes.EnvVar::{
, name = "PATREON_REFRESH_TOKEN"
, value = Some env:PATREON_REFRESH_TOKEN as Text
}
]
in kms.app.make
kms.app.Config::{
, name = "christinewebsite"
, appPort = 5000
, appPort = 3030
, image = image
, replicas = 2
, domain = "christine.website"
, leIssuer = "prod"
, envVars = vars


@ -1,28 +1,51 @@
{ pkgs ? import (import ./nix/sources.nix).nixpkgs { } }:
{ sources ? import ./nix/sources.nix, pkgs ? import sources.nixpkgs { } }:
with pkgs;
assert lib.versionAtLeast go.version "1.13";
let
srcNoTarget = dir:
builtins.filterSource
(path: type: type != "directory" || builtins.baseNameOf path != "target")
dir;
buildGoPackage rec {
name = "christinewebsite-HEAD";
version = "latest";
goPackagePath = "christine.website";
src = ./.;
goDeps = ./nix/deps.nix;
allowGoReference = false;
naersk = pkgs.callPackage sources.naersk { };
dhallpkgs = import sources.easy-dhall-nix { inherit pkgs; };
src = srcNoTarget ./.;
preBuild = ''
export CGO_ENABLED=0
buildFlagsArray+=(-pkgdir "$TMPDIR")
'';
xesite = naersk.buildPackage {
inherit src;
buildInputs = [ pkg-config openssl git ];
remapPathPrefix = true;
};
postInstall = ''
config = stdenv.mkDerivation {
pname = "xesite-config";
version = "HEAD";
buildInputs = [ dhallpkgs.dhall-simple ];
phases = "installPhase";
installPhase = ''
cd ${src}
dhall resolve < ${src}/config.dhall >> $out
'';
};
in pkgs.stdenv.mkDerivation {
inherit (xesite) name;
inherit src;
phases = "installPhase";
installPhase = ''
mkdir -p $out $out/bin
cp -rf ${config} $out/config.dhall
cp -rf $src/blog $out/blog
cp -rf $src/css $out/css
cp -rf $src/gallery $out/gallery
cp -rf $src/signalboost.dhall $out/signalboost.dhall
cp -rf $src/static $out/static
cp -rf $src/talks $out/talks
cp -rf $src/templates $out/templates
cp -rf ${xesite}/bin/xesite $out/bin/xesite
'';
}

191
src/app.rs Normal file

@ -0,0 +1,191 @@
use crate::{post::Post, signalboost::Person};
use anyhow::Result;
use atom_syndication as atom;
use comrak::{markdown_to_html, ComrakOptions};
use serde::Deserialize;
use std::{fs, path::PathBuf};
#[derive(Clone, Deserialize)]
pub struct Config {
#[serde(rename = "clackSet")]
clack_set: Vec<String>,
signalboost: Vec<Person>,
port: u16,
#[serde(rename = "resumeFname")]
resume_fname: PathBuf,
}
pub fn markdown(inp: &str) -> String {
let mut options = ComrakOptions::default();
options.extension.autolink = true;
options.extension.table = true;
options.extension.description_lists = true;
options.extension.superscript = true;
options.extension.strikethrough = true;
options.extension.footnotes = true;
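// unsafe_ lets raw HTML in the Markdown source pass straight through to the
// output; the posts being rendered are trusted, local content.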
options.render.unsafe_ = true;
markdown_to_html(inp, &options)
}
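// Patreon API failures are logged and collapsed into Ok(None) so that a bad
// or expired token degrades the patrons page instead of stopping startup.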
async fn patrons() -> Result<Option<patreon::Users>> {
use patreon::*;
let creds: Credentials = envy::prefixed("PATREON_").from_env().unwrap();
let cli = Client::new(creds);
match cli.campaign().await {
Ok(camp) => {
let id = camp.data[0].id.clone();
match cli.pledges(id).await {
Ok(users) => Ok(Some(users)),
Err(why) => {
log::error!("error getting pledges: {:?}", why);
Ok(None)
}
}
}
Err(why) => {
log::error!("error getting patreon campaign: {:?}", why);
Ok(None)
}
}
}
pub const ICON: &'static str = "https://christine.website/static/img/avatar.png";
pub struct State {
pub cfg: Config,
pub signalboost: Vec<Person>,
pub resume: String,
pub blog: Vec<Post>,
pub gallery: Vec<Post>,
pub talks: Vec<Post>,
pub everything: Vec<Post>,
pub jf: jsonfeed::Feed,
pub rf: rss::Channel,
pub af: atom::Feed,
pub sitemap: Vec<u8>,
pub patrons: Option<patreon::Users>,
}
pub async fn init(cfg: PathBuf) -> Result<State> {
let cfg: Config = serde_dhall::from_file(cfg).parse()?;
let sb = cfg.signalboost.clone();
let resume = fs::read_to_string(cfg.resume_fname.clone())?;
let resume: String = markdown(&resume);
let blog = crate::post::load("blog")?;
let gallery = crate::post::load("gallery")?;
let talks = crate::post::load("talks")?;
let mut everything: Vec<Post> = vec![];
{
let blog = blog.clone();
let gallery = gallery.clone();
let talks = talks.clone();
everything.extend(blog.iter().cloned());
everything.extend(gallery.iter().cloned());
everything.extend(talks.iter().cloned());
};
everything.sort();
everything.reverse();
let mut ri: Vec<rss::Item> = vec![];
let mut ai: Vec<atom::Entry> = vec![];
let mut jfb = jsonfeed::Feed::builder()
.title("Christine Dodrill's Blog")
.description("My blog posts and rants about various technology things.")
.author(
jsonfeed::Author::new()
.name("Christine Dodrill")
.url("https://christine.website")
.avatar(ICON),
)
.feed_url("https://christine.website/blog.json")
.user_comment("This is a JSON feed of my blogposts. For more information read: https://jsonfeed.org/version/1")
.home_page_url("https://christine.website")
.icon(ICON)
.favicon(ICON);
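// Post converts into the jsonfeed, RSS and Atom item types, so one pass over
// the combined post list fills all three feeds.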
for post in &everything {
let post = post.clone();
jfb = jfb.item(post.clone().into());
ri.push(post.clone().into());
ai.push(post.clone().into());
}
let af = {
let mut af = atom::FeedBuilder::default();
af.title("Christine Dodrill's Blog");
af.id("https://christine.website/blog");
af.generator({
let mut generator = atom::Generator::default();
generator.set_value(env!("CARGO_PKG_NAME"));
generator.set_version(env!("CARGO_PKG_VERSION").to_string());
generator.set_uri("https://github.com/Xe/site".to_string());
generator
});
af.entries(ai);
af.build().unwrap()
};
let rf = {
let mut rf = rss::ChannelBuilder::default();
rf.title("Christine Dodrill's Blog");
rf.link("https://christine.website/blog");
rf.generator(crate::APPLICATION_NAME.to_string());
rf.items(ri);
rf.build().unwrap()
};
let mut sm: Vec<u8> = vec![];
let smw = sitemap::writer::SiteMapWriter::new(&mut sm);
let mut urlwriter = smw.start_urlset()?;
for url in &[
"https://christine.website/resume",
"https://christine.website/contact",
"https://christine.website/",
"https://christine.website/blog",
"https://christine.website/signalboost",
] {
urlwriter.url(*url)?;
}
for post in &everything {
urlwriter.url(format!("https://christine.website/{}", post.link))?;
}
urlwriter.end()?;
Ok(State {
cfg,
signalboost: sb,
resume,
blog,
gallery,
talks,
everything,
jf: jfb.build(),
af,
rf,
sitemap: sm,
patrons: patrons().await?,
})
}
#[cfg(test)]
mod tests {
use anyhow::Result;
#[tokio::test]
async fn init() -> Result<()> {
super::init("./config.dhall".into()).await?;
Ok(())
}
}

11
src/build.rs Normal file

@ -0,0 +1,11 @@
use ructe::{Result, Ructe};
use std::process::Command;
fn main() -> Result<()> {
Ructe::from_env()?.compile_templates("templates")?;
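// Embed the current git commit hash into the binary as the GITHUB_SHA
// compile-time environment variable.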
let output = Command::new("git").args(&["rev-parse", "HEAD"]).output().unwrap();
let git_hash = String::from_utf8(output.stdout).unwrap();
println!("cargo:rustc-env=GITHUB_SHA={}", git_hash);
Ok(())
}

77
src/handlers/blog.rs Normal file

@ -0,0 +1,77 @@
use super::{PostNotFound, SeriesNotFound};
use crate::{
app::State,
post::Post,
templates::{self, Html, RenderRucte},
};
use lazy_static::lazy_static;
use prometheus::{IntCounterVec, register_int_counter_vec, opts};
use std::sync::Arc;
use warp::{http::Response, Rejection, Reply};
lazy_static! {
static ref HIT_COUNTER: IntCounterVec =
register_int_counter_vec!(opts!("blogpost_hits", "Number of hits to blogposts"), &["name"])
.unwrap();
}
pub async fn index(state: Arc<State>) -> Result<impl Reply, Rejection> {
let state = state.clone();
Response::builder().html(|o| templates::blogindex_html(o, state.blog.clone()))
}
pub async fn series(state: Arc<State>) -> Result<impl Reply, Rejection> {
let state = state.clone();
let mut series: Vec<String> = vec![];
for post in &state.blog {
if post.front_matter.series.is_some() {
series.push(post.front_matter.series.as_ref().unwrap().clone());
}
}
series.sort();
series.dedup();
Response::builder().html(|o| templates::series_html(o, series))
}
pub async fn series_view(series: String, state: Arc<State>) -> Result<impl Reply, Rejection> {
let state = state.clone();
let mut posts: Vec<Post> = vec![];
for post in &state.blog {
if post.front_matter.series.is_none() {
continue;
}
if post.front_matter.series.as_ref().unwrap() != &series {
continue;
}
posts.push(post.clone());
}
if posts.is_empty() {
Err(SeriesNotFound(series).into())
} else {
Response::builder().html(|o| templates::series_posts_html(o, series, &posts))
}
}
pub async fn post_view(name: String, state: Arc<State>) -> Result<impl Reply, Rejection> {
let mut want: Option<Post> = None;
for post in &state.blog {
if post.link == format!("blog/{}", name) {
want = Some(post.clone());
}
}
match want {
None => Err(PostNotFound("blog".into(), name).into()),
Some(post) => {
HIT_COUNTER.with_label_values(&[name.clone().as_str()]).inc();
let body = Html(post.body_html.clone());
Response::builder().html(|o| templates::blogpost_html(o, post, body))
}
}
}

73
src/handlers/feeds.rs Normal file

@ -0,0 +1,73 @@
use crate::app::State;
use lazy_static::lazy_static;
use prometheus::{opts, register_int_counter_vec, IntCounterVec};
use std::sync::Arc;
use warp::{http::Response, Rejection, Reply};
lazy_static! {
static ref HIT_COUNTER: IntCounterVec = register_int_counter_vec!(
opts!("feed_hits", "Number of hits to various feeds"),
&["kind"]
)
.unwrap();
}
pub async fn jsonfeed(state: Arc<State>) -> Result<impl Reply, Rejection> {
HIT_COUNTER.with_label_values(&["json"]).inc();
let state = state.clone();
Ok(warp::reply::json(&state.jf))
}
#[derive(Debug)]
pub enum RenderError {
WriteAtom(atom_syndication::Error),
WriteRss(rss::Error),
Build(warp::http::Error),
}
impl warp::reject::Reject for RenderError {}
pub async fn atom(state: Arc<State>) -> Result<impl Reply, Rejection> {
HIT_COUNTER.with_label_values(&["atom"]).inc();
let state = state.clone();
let mut buf = Vec::new();
state
.af
.write_to(&mut buf)
.map_err(RenderError::WriteAtom)
.map_err(warp::reject::custom)?;
Response::builder()
.status(200)
.header("Content-Type", "application/atom+xml")
.body(buf)
.map_err(RenderError::Build)
.map_err(warp::reject::custom)
}
pub async fn rss(state: Arc<State>) -> Result<impl Reply, Rejection> {
HIT_COUNTER.with_label_values(&["rss"]).inc();
let state = state.clone();
let mut buf = Vec::new();
state
.rf
.write_to(&mut buf)
.map_err(RenderError::WriteRss)
.map_err(warp::reject::custom)?;
Response::builder()
.status(200)
.header("Content-Type", "application/rss+xml")
.body(buf)
.map_err(RenderError::Build)
.map_err(warp::reject::custom)
}
pub async fn sitemap(state: Arc<State>) -> Result<impl Reply, Rejection> {
HIT_COUNTER.with_label_values(&["sitemap"]).inc();
let state = state.clone();
Response::builder()
.status(200)
.header("Content-Type", "application/xml")
.body(state.sitemap.clone())
.map_err(RenderError::Build)
.map_err(warp::reject::custom)
}

40
src/handlers/gallery.rs Normal file

@ -0,0 +1,40 @@
use super::PostNotFound;
use crate::{
app::State,
post::Post,
templates::{self, Html, RenderRucte},
};
use lazy_static::lazy_static;
use prometheus::{IntCounterVec, register_int_counter_vec, opts};
use std::sync::Arc;
use warp::{http::Response, Rejection, Reply};
lazy_static! {
static ref HIT_COUNTER: IntCounterVec =
register_int_counter_vec!(opts!("gallery_hits", "Number of hits to gallery images"), &["name"])
.unwrap();
}
pub async fn index(state: Arc<State>) -> Result<impl Reply, Rejection> {
let state = state.clone();
Response::builder().html(|o| templates::galleryindex_html(o, state.gallery.clone()))
}
pub async fn post_view(name: String, state: Arc<State>) -> Result<impl Reply, Rejection> {
let mut want: Option<Post> = None;
for post in &state.gallery {
if post.link == format!("gallery/{}", name) {
want = Some(post.clone());
}
}
match want {
None => Err(PostNotFound("gallery".into(), name).into()),
Some(post) => {
HIT_COUNTER.with_label_values(&[name.clone().as_str()]).inc();
let body = Html(post.body_html.clone());
Response::builder().html(|o| templates::gallerypost_html(o, post, body))
}
}
}

145
src/handlers/mod.rs Normal file

@ -0,0 +1,145 @@
use crate::{
app::State,
templates::{self, Html, RenderRucte},
};
use lazy_static::lazy_static;
use prometheus::{opts, register_int_counter_vec, IntCounterVec};
use std::{convert::Infallible, fmt, sync::Arc};
use warp::{
http::{Response, StatusCode},
Rejection, Reply,
};
lazy_static! {
static ref HIT_COUNTER: IntCounterVec =
register_int_counter_vec!(opts!("hits", "Number of hits to various pages"), &["page"])
.unwrap();
}
pub async fn index() -> Result<impl Reply, Rejection> {
HIT_COUNTER.with_label_values(&["index"]).inc();
Response::builder().html(|o| templates::index_html(o))
}
pub async fn contact() -> Result<impl Reply, Rejection> {
HIT_COUNTER.with_label_values(&["contact"]).inc();
Response::builder().html(|o| templates::contact_html(o))
}
pub async fn feeds() -> Result<impl Reply, Rejection> {
HIT_COUNTER.with_label_values(&["feeds"]).inc();
Response::builder().html(|o| templates::feeds_html(o))
}
pub async fn resume(state: Arc<State>) -> Result<impl Reply, Rejection> {
HIT_COUNTER.with_label_values(&["resume"]).inc();
let state = state.clone();
Response::builder().html(|o| templates::resume_html(o, Html(state.resume.clone())))
}
pub async fn patrons(state: Arc<State>) -> Result<impl Reply, Rejection> {
HIT_COUNTER.with_label_values(&["patrons"]).inc();
let state = state.clone();
match &state.patrons {
None => Response::builder().status(500).html(|o| {
templates::error_html(
o,
"Could not load patrons, let me know if the API token expired again".to_string(),
)
}),
Some(patrons) => Response::builder().html(|o| templates::patrons_html(o, patrons.clone())),
}
}
pub async fn signalboost(state: Arc<State>) -> Result<impl Reply, Rejection> {
HIT_COUNTER.with_label_values(&["signalboost"]).inc();
let state = state.clone();
Response::builder().html(|o| templates::signalboost_html(o, state.signalboost.clone()))
}
pub async fn not_found() -> Result<impl Reply, Rejection> {
HIT_COUNTER.with_label_values(&["not_found"]).inc();
Response::builder().html(|o| templates::notfound_html(o, "some path".into()))
}
pub mod blog;
pub mod feeds;
pub mod gallery;
pub mod talks;
#[derive(Debug, thiserror::Error)]
struct PostNotFound(String, String);
impl fmt::Display for PostNotFound {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
write!(f, "not found: {}/{}", self.0, self.1)
}
}
impl warp::reject::Reject for PostNotFound {}
impl From<PostNotFound> for warp::reject::Rejection {
fn from(error: PostNotFound) -> Self {
warp::reject::custom(error)
}
}
#[derive(Debug, thiserror::Error)]
struct SeriesNotFound(String);
impl fmt::Display for SeriesNotFound {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
write!(f, "{}", self.0)
}
}
impl warp::reject::Reject for SeriesNotFound {}
impl From<SeriesNotFound> for warp::reject::Rejection {
fn from(error: SeriesNotFound) -> Self {
warp::reject::custom(error)
}
}
lazy_static! {
static ref REJECTION_COUNTER: IntCounterVec = register_int_counter_vec!(
opts!("rejections", "Number of rejections by kind"),
&["kind"]
)
.unwrap();
}
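// Catch-all rejection handler: counts each rejection kind in Prometheus, logs
// it, and renders the not-found template with an appropriate status code.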
pub async fn rejection(err: Rejection) -> Result<impl Reply, Infallible> {
let path: String;
let code;
if err.is_not_found() {
REJECTION_COUNTER.with_label_values(&["404"]).inc();
path = "".into();
code = StatusCode::NOT_FOUND;
} else if let Some(SeriesNotFound(series)) = err.find() {
REJECTION_COUNTER
.with_label_values(&["SeriesNotFound"])
.inc();
log::error!("invalid series {}", series);
path = format!("/blog/series/{}", series);
code = StatusCode::NOT_FOUND;
} else if let Some(PostNotFound(kind, name)) = err.find() {
REJECTION_COUNTER.with_label_values(&["PostNotFound"]).inc();
log::error!("unknown post {}/{}", kind, name);
path = format!("/{}/{}", kind, name);
code = StatusCode::NOT_FOUND;
} else {
REJECTION_COUNTER.with_label_values(&["Other"]).inc();
log::error!("unhandled rejection: {:?}", err);
path = format!("weird rejection: {:?}", err);
code = StatusCode::INTERNAL_SERVER_ERROR;
}
Ok(warp::reply::with_status(
Response::builder()
.html(|o| templates::notfound_html(o, path))
.unwrap(),
code,
))
}

40
src/handlers/talks.rs Normal file

@ -0,0 +1,40 @@
use super::PostNotFound;
use crate::{
app::State,
post::Post,
templates::{self, Html, RenderRucte},
};
use lazy_static::lazy_static;
use prometheus::{IntCounterVec, register_int_counter_vec, opts};
use std::sync::Arc;
use warp::{http::Response, Rejection, Reply};
lazy_static! {
static ref HIT_COUNTER: IntCounterVec =
register_int_counter_vec!(opts!("talks_hits", "Number of hits to talks"), &["name"])
.unwrap();
}
pub async fn index(state: Arc<State>) -> Result<impl Reply, Rejection> {
let state = state.clone();
Response::builder().html(|o| templates::talkindex_html(o, state.talks.clone()))
}
pub async fn post_view(name: String, state: Arc<State>) -> Result<impl Reply, Rejection> {
let mut want: Option<Post> = None;
for post in &state.talks {
if post.link == format!("talks/{}", name) {
want = Some(post.clone());
}
}
match want {
None => Err(PostNotFound("talks".into(), name).into()),
Some(post) => {
HIT_COUNTER.with_label_values(&[name.as_str()]).inc();
let body = Html(post.body_html.clone());
Response::builder().html(|o| templates::talkpost_html(o, post, body))
}
}
}

154
src/main.rs Normal file

@ -0,0 +1,154 @@
use anyhow::Result;
use hyper::{header::CONTENT_TYPE, Body, Response};
use prometheus::{Encoder, TextEncoder};
use std::sync::Arc;
use warp::{path, Filter};
pub mod app;
pub mod handlers;
pub mod post;
pub mod signalboost;
use app::State;
const APPLICATION_NAME: &str = concat!(env!("CARGO_PKG_NAME"), "/", env!("CARGO_PKG_VERSION"));
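// Wraps the shared application state in a filter so handlers can take an
// Arc<State> as an extracted argument.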
fn with_state(
state: Arc<State>,
) -> impl Filter<Extract = (Arc<State>,), Error = std::convert::Infallible> + Clone {
warp::any().map(move || state.clone())
}
#[tokio::main]
async fn main() -> Result<()> {
let _ = kankyo::init();
pretty_env_logger::init();
log::info!("starting up commit {}", env!("GITHUB_SHA"));
let state = Arc::new(app::init(
std::env::var("CONFIG_FNAME")
.unwrap_or("./config.dhall".into())
.as_str()
.into(),
).await?);
let healthcheck = warp::get().and(warp::path(".within").and(warp::path("health")).map(|| "OK"));
let base = warp::path!("blog" / ..);
let blog_index = base
.and(warp::path::end())
.and(with_state(state.clone()))
.and_then(handlers::blog::index);
let series = base
.and(warp::path!("series").and(with_state(state.clone()).and_then(handlers::blog::series)));
let series_view = base.and(
warp::path!("series" / String)
.and(with_state(state.clone()))
.and(warp::get())
.and_then(handlers::blog::series_view),
);
let post_view = base.and(
warp::path!(String)
.and(with_state(state.clone()))
.and(warp::get())
.and_then(handlers::blog::post_view),
);
let gallery_base = warp::path!("gallery" / ..);
let gallery_index = gallery_base
.and(warp::path::end())
.and(with_state(state.clone()))
.and_then(handlers::gallery::index);
let gallery_post_view = gallery_base.and(
warp::path!(String)
.and(with_state(state.clone()))
.and(warp::get())
.and_then(handlers::gallery::post_view),
);
let talk_base = warp::path!("talks" / ..);
let talk_index = talk_base
.and(warp::path::end())
.and(with_state(state.clone()))
.and_then(handlers::talks::index);
let talk_post_view = talk_base.and(
warp::path!(String)
.and(with_state(state.clone()))
.and(warp::get())
.and_then(handlers::talks::post_view),
);
let index = warp::get().and(path::end().and_then(handlers::index));
let contact = warp::path!("contact").and_then(handlers::contact);
let feeds = warp::path!("feeds").and_then(handlers::feeds);
let resume = warp::path!("resume")
.and(with_state(state.clone()))
.and_then(handlers::resume);
let signalboost = warp::path!("signalboost")
.and(with_state(state.clone()))
.and_then(handlers::signalboost);
let patrons = warp::path!("patrons")
.and(with_state(state.clone()))
.and_then(handlers::patrons);
let files = warp::path("static").and(warp::fs::dir("./static"));
let css = warp::path("css").and(warp::fs::dir("./css"));
let sw = warp::path("sw.js").and(warp::fs::file("./static/js/sw.js"));
let robots = warp::path("robots.txt").and(warp::fs::file("./static/robots.txt"));
let favicon = warp::path("favicon.ico").and(warp::fs::file("./static/favicon/favicon.ico"));
let jsonfeed = warp::path("blog.json")
.and(with_state(state.clone()))
.and_then(handlers::feeds::jsonfeed);
let atom = warp::path("blog.atom")
.and(with_state(state.clone()))
.and_then(handlers::feeds::atom);
let rss = warp::path("blog.rss")
.and(with_state(state.clone()))
.and_then(handlers::feeds::rss);
let sitemap = warp::path("sitemap.xml")
.and(with_state(state.clone()))
.and_then(handlers::feeds::sitemap);
let go_vanity_jsonfeed = warp::path("jsonfeed")
.and(warp::any().map(move || "christine.website/jsonfeed"))
.and(warp::any().map(move || "https://tulpa.dev/Xe/jsonfeed"))
.and_then(go_vanity::gitea);
let metrics_endpoint = warp::path("metrics").and(warp::path::end()).map(move || {
let encoder = TextEncoder::new();
let metric_families = prometheus::gather();
let mut buffer = vec![];
encoder.encode(&metric_families, &mut buffer).unwrap();
Response::builder()
.status(200)
.header(CONTENT_TYPE, encoder.format_type())
.body(Body::from(buffer))
.unwrap()
});
let site = index
.or(contact.or(feeds).or(resume.or(signalboost)).or(patrons))
.or(blog_index.or(series.or(series_view).or(post_view)))
.or(gallery_index.or(gallery_post_view))
.or(talk_index.or(talk_post_view))
.or(jsonfeed.or(atom).or(rss.or(sitemap)))
.or(files.or(css).or(favicon).or(sw.or(robots)))
.or(healthcheck.or(metrics_endpoint).or(go_vanity_jsonfeed))
.map(|reply| {
warp::reply::with_header(
reply,
"X-Hacker",
"If you are reading this, check out /signalboost to find people for your team",
)
})
.map(|reply| warp::reply::with_header(reply, "X-Clacks-Overhead", "GNU Ashlynn"))
.with(warp::log(APPLICATION_NAME))
.recover(handlers::rejection);
warp::serve(site).run(([0, 0, 0, 0], 3030)).await;
Ok(())
}
include!(concat!(env!("OUT_DIR"), "/templates.rs"));
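The templates.rs pulled in by the include! above is generated at build time by ructe. The build script itself is not part of this diff; a minimal sketch of what it could look like, assuming ructe's Ructe::from_env API:

use ructe::{Ructe, RucteError};

fn main() -> Result<(), RucteError> {
    // Compile everything under templates/ into $OUT_DIR/templates.rs.
    let mut ructe = Ructe::from_env()?;
    ructe.compile_templates("templates")
}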

114
src/post/frontmatter.rs Normal file

@ -0,0 +1,114 @@
/// This code was borrowed from @fasterthanlime.
use anyhow::Result;
use serde::{Serialize, Deserialize};
#[derive(Eq, PartialEq, Deserialize, Default, Debug, Serialize, Clone)]
pub struct Data {
pub title: String,
pub date: String,
pub series: Option<String>,
pub tags: Option<Vec<String>>,
pub slides_link: Option<String>,
pub image: Option<String>,
pub thumb: Option<String>,
pub show: Option<bool>,
}
enum State {
SearchForStart,
ReadingMarker { count: usize, end: bool },
ReadingFrontMatter { buf: String, line_start: bool },
SkipNewline { end: bool },
}
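// Data::parse below walks the input character by character: it looks for an
// opening `---` marker, buffers everything up to the closing `---`, hands that
// buffer to serde_yaml, and returns the parsed Data plus the byte offset where
// the post body starts.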
#[derive(Debug, thiserror::Error)]
enum Error {
#[error("EOF while parsing frontmatter")]
EOF,
#[error("Error parsing yaml: {0:?}")]
Yaml(#[from] serde_yaml::Error),
}
impl Data {
pub fn parse(input: &str) -> Result<(Data, usize)> {
let mut state = State::SearchForStart;
let mut payload = None;
let offset;
let mut chars = input.char_indices();
'parse: loop {
let (idx, ch) = match chars.next() {
Some(x) => x,
None => return Err(Error::EOF)?,
};
match &mut state {
State::SearchForStart => match ch {
'-' => {
state = State::ReadingMarker {
count: 1,
end: false,
};
}
'\n' | '\t' | ' ' => {
// ignore whitespace
}
_ => {
panic!("Start of frontmatter not found");
}
},
State::ReadingMarker { count, end } => match ch {
'-' => {
*count += 1;
if *count == 3 {
state = State::SkipNewline { end: *end };
}
}
_ => {
panic!("Malformed frontmatter marker");
}
},
State::SkipNewline { end } => match ch {
'\n' => {
if *end {
offset = idx + 1;
break 'parse;
} else {
state = State::ReadingFrontMatter {
buf: String::new(),
line_start: true,
};
}
}
_ => panic!("Expected newline, got {:?}", ch),
},
State::ReadingFrontMatter { buf, line_start } => match ch {
'-' if *line_start => {
let mut state_temp = State::ReadingMarker {
count: 1,
end: true,
};
std::mem::swap(&mut state, &mut state_temp);
if let State::ReadingFrontMatter { buf, .. } = state_temp {
payload = Some(buf);
} else {
unreachable!();
}
}
ch => {
buf.push(ch);
*line_start = ch == '\n';
}
},
}
}
// unwrap justification: option set in state machine, Rust can't statically analyze it
let payload = payload.unwrap();
let fm: Self = serde_yaml::from_str(&payload)?;
Ok((fm, offset))
}
}
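A minimal sketch of how Data::parse behaves on a well-formed post (hypothetical input and test, not part of this diff; real posts live under blog/, gallery/ and talks/):

#[cfg(test)]
mod parse_example {
    use super::Data;

    #[test]
    fn basic() -> anyhow::Result<()> {
        let input = "---\ntitle: Example post\ndate: 2020-07-16\n---\nHello, world.\n";
        let (data, offset) = Data::parse(input)?;
        assert_eq!(data.title, "Example post");
        assert_eq!(data.date, "2020-07-16");
        assert_eq!(&input[offset..], "Hello, world.\n");
        Ok(())
    }
}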

178
src/post/mod.rs Normal file

@ -0,0 +1,178 @@
use anyhow::{anyhow, Result};
use atom_syndication as atom;
use chrono::prelude::*;
use glob::glob;
use std::{cmp::Ordering, fs};
pub mod frontmatter;
#[derive(Eq, PartialEq, Debug, Clone)]
pub struct Post {
pub front_matter: frontmatter::Data,
pub link: String,
pub body: String,
pub body_html: String,
pub date: DateTime<FixedOffset>,
}
impl Into<jsonfeed::Item> for Post {
fn into(self) -> jsonfeed::Item {
let mut result = jsonfeed::Item::builder()
.title(self.front_matter.title)
.content_html(self.body_html)
.content_text(self.body)
.id(format!("https://christine.website/{}", self.link))
.url(format!("https://christine.website/{}", self.link))
.date_published(self.date.to_rfc3339())
.author(
jsonfeed::Author::new()
.name("Christine Dodrill")
.url("https://christine.website")
.avatar("https://christine.website/static/img/avatar.png"),
);
let mut tags: Vec<String> = vec![];
if let Some(series) = self.front_matter.series {
tags.push(series);
}
if let Some(mut meta_tags) = self.front_matter.tags {
tags.append(&mut meta_tags);
}
if !tags.is_empty() {
result = result.tags(tags);
}
if let Some(image_url) = self.front_matter.image {
result = result.image(image_url);
}
result.build().unwrap()
}
}
impl Into<atom::Entry> for Post {
fn into(self) -> atom::Entry {
let mut content = atom::ContentBuilder::default();
content.src(format!("https://christine.website/{}", self.link));
content.content_type(Some("text/html;charset=utf-8".into()));
content.value(Some(xml::escape::escape_str_pcdata(&self.body_html).into()));
let content = content.build().unwrap();
let mut result = atom::EntryBuilder::default();
result.id(format!("https://christine.website/{}", self.link));
result.contributors({
let mut me = atom::Person::default();
me.set_name("Christine Dodrill");
me.set_email("me@christine.website".to_string());
me.set_uri("https://christine.website".to_string());
vec![me]
});
result.title(self.front_matter.title);
let mut link = atom::Link::default();
link.href = format!("https://christine.website/{}", self.link);
result.links(vec![link]);
result.content(content);
result.published(self.date);
result.build().unwrap()
}
}
impl Into<rss::Item> for Post {
fn into(self) -> rss::Item {
let mut guid = rss::Guid::default();
guid.set_value(format!("https://christine.website/{}", self.link));
let mut result = rss::ItemBuilder::default();
result.title(Some(self.front_matter.title));
result.link(format!("https://christine.website/{}", self.link));
result.guid(guid);
result.author(Some("me@christine.website (Christine Dodrill)".to_string()));
result.content(self.body_html);
result.pub_date(self.date.to_rfc2822());
result.build().unwrap()
}
}
impl Ord for Post {
fn cmp(&self, other: &Self) -> Ordering {
self.partial_cmp(other).unwrap()
}
}
impl PartialOrd for Post {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
Some(self.date.cmp(&other.date))
}
}
impl Post {
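/// Renders the post date in the site's display format, e.g. 2020-07-16 becomes "M07 16 2020".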
pub fn detri(&self) -> String {
self.date.format("M%m %d %Y").to_string()
}
}
pub fn load(dir: &str) -> Result<Vec<Post>> {
let mut result: Vec<Post> = vec![];
for path in glob(&format!("{}/*.markdown", dir))?.filter_map(Result::ok) {
let body = fs::read_to_string(&path)?;
let (fm, content_offset) = frontmatter::Data::parse(body.as_str())?;
let markup = &body[content_offset..];
let date = NaiveDate::parse_from_str(&fm.date, "%Y-%m-%d")?;
result.push(Post {
front_matter: fm,
link: format!("{}/{}", dir, path.file_stem().unwrap().to_str().unwrap()),
body: markup.to_string(),
body_html: crate::app::markdown(&markup),
date: {
DateTime::<Utc>::from_utc(
NaiveDateTime::new(date, NaiveTime::from_hms(0, 0, 0)),
Utc,
)
.with_timezone(&Utc)
.into()
},
})
}
if result.is_empty() {
Err(anyhow!("no posts loaded"))
} else {
result.sort();
result.reverse();
Ok(result)
}
}
#[cfg(test)]
mod tests {
use super::*;
use anyhow::Result;
#[test]
fn blog() -> Result<()> {
load("blog")?;
Ok(())
}
#[test]
fn gallery() -> Result<()> {
load("gallery")?;
Ok(())
}
#[test]
fn talks() -> Result<()> {
load("talks")?;
Ok(())
}
}

23
src/signalboost.rs Normal file

@ -0,0 +1,23 @@
use serde::Deserialize;
#[derive(Clone, Debug, Deserialize)]
pub struct Person {
pub name: String,
pub tags: Vec<String>,
#[serde(rename = "gitLink")]
pub git_link: String,
pub twitter: String,
}
#[cfg(test)]
mod tests {
use anyhow::Result;
#[test]
fn load() -> Result<()> {
let _people: Vec<super::Person> = serde_dhall::from_file("./signalboost.dhall").parse()?;
Ok(())
}
}
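A hedged sketch of how one entry of the Dhall list parsed above maps onto Person (inline value and test are hypothetical; serde_dhall::from_str mirrors the from_file call used in the real test):

#[test]
fn parse_inline_person() -> anyhow::Result<()> {
    // Invented example entry; the Dhall field gitLink maps to git_link via serde rename.
    let people: Vec<crate::signalboost::Person> = serde_dhall::from_str(
        r#"[ { name = "Example Person", tags = [ "rust" ], gitLink = "https://github.com/example", twitter = "https://twitter.com/example" } ]"#,
    )
    .parse()?;
    assert_eq!(people[0].name, "Example Person");
    Ok(())
}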

7
static/js/installsw.js Normal file

@ -0,0 +1,7 @@
if (navigator.serviceWorker.controller) {
console.log("Active service worker found, no need to register");
} else {
navigator.serviceWorker.register("/sw.js").then(function(reg) {
console.log("Service worker has been registered for scope:" + reg.scope);
});
}


@ -1,2 +0,0 @@
/*! instant.page v3.0.0 - (C) 2019 Alexandre Dieulot - https://instant.page/license */
let t,e;const n=new Set,o=document.createElement("link"),s=o.relList&&o.relList.supports&&o.relList.supports("prefetch")&&window.IntersectionObserver&&"isIntersecting"in IntersectionObserverEntry.prototype,i="instantAllowQueryString"in document.body.dataset,r="instantAllowExternalLinks"in document.body.dataset,a="instantWhitelist"in document.body.dataset;let c=65,d=!1,l=!1,u=!1;if("instantIntensity"in document.body.dataset){const t=document.body.dataset.instantIntensity;if("mousedown"==t.substr(0,"mousedown".length))d=!0,"mousedown-only"==t&&(l=!0);else if("viewport"==t.substr(0,"viewport".length))navigator.connection&&(navigator.connection.saveData||navigator.connection.effectiveType.includes("2g"))||("viewport"==t?document.documentElement.clientWidth*document.documentElement.clientHeight<45e4&&(u=!0):"viewport-all"==t&&(u=!0));else{const e=parseInt(t);isNaN(e)||(c=e)}}if(s){const n={capture:!0,passive:!0};if(l||document.addEventListener("touchstart",function(t){e=performance.now();const n=t.target.closest("a");if(!f(n))return;h(n.href)},n),d?document.addEventListener("mousedown",function(t){const e=t.target.closest("a");if(!f(e))return;h(e.href)},n):document.addEventListener("mouseover",function(n){if(performance.now()-e<1100)return;const o=n.target.closest("a");if(!f(o))return;o.addEventListener("mouseout",m,{passive:!0}),t=setTimeout(()=>{h(o.href),t=void 0},c)},n),u){let t;(t=window.requestIdleCallback?t=>{requestIdleCallback(t,{timeout:1500})}:t=>{t()})(()=>{const t=new IntersectionObserver(e=>{e.forEach(e=>{if(e.isIntersecting){const n=e.target;t.unobserve(n),h(n.href)}})});document.querySelectorAll("a").forEach(e=>{f(e)&&t.observe(e)})})}}function m(e){e.relatedTarget&&e.target.closest("a")==e.relatedTarget.closest("a")||t&&(clearTimeout(t),t=void 0)}function f(t){if(t&&t.href&&(!a||"instant"in t.dataset)&&(r||t.origin==location.origin||"instant"in t.dataset)&&["http:","https:"].includes(t.protocol)&&("http:"!=t.protocol||"https:"!=location.protocol)&&(i||!t.search||"instant"in t.dataset)&&!(t.hash&&t.pathname+t.search==location.pathname+location.search||"noInstant"in t.dataset))return!0}function h(t){if(n.has(t))return;const e=document.createElement("link");e.rel="prefetch",e.href=t,document.head.appendChild(e),n.add(t)}


@ -5,37 +5,37 @@ self.addEventListener('install', function(event) {
event.waitUntil(preLoad());
});
const cacheName = "cache-2019-11-01";
const cacheName = "cache-xesite-2.0.0";
var preLoad = function(){
console.log('[PWA Builder] Install Event processing');
return caches.open(cacheName).then(function(cache) {
console.log('[PWA Builder] Cached index and offline page during Install');
return cache.addAll(['/blog/', '/blog', '/', '/contact', '/resume', '/talks', '/gallery']);
});
console.log('[PWA Builder] Install Event processing');
return caches.open(cacheName).then(function(cache) {
console.log('[PWA Builder] Cached index and offline page during Install');
return cache.addAll(['/blog/', '/blog', '/', '/contact', '/resume', '/talks', '/gallery', '/signalboost']);
});
};
self.addEventListener('fetch', function(event) {
if (event.request.cache === 'only-if-cached' && event.request.mode !== 'same-origin') {
return;
}
console.log('[PWA Builder] The service worker is serving the asset.');
event.respondWith(checkResponse(event.request).catch(function() {
return returnFromCache(event.request);
}));
event.waitUntil(addToCache(event.request));
if (event.request.cache === 'only-if-cached' && event.request.mode !== 'same-origin') {
return;
}
console.log('[PWA Builder] The service worker is serving the asset.');
event.respondWith(checkResponse(event.request).catch(function() {
return returnFromCache(event.request);
}));
event.waitUntil(addToCache(event.request));
});
var checkResponse = function(request){
return new Promise(function(fulfill, reject) {
fetch(request).then(function(response){
if(response.status !== 404) {
fulfill(response);
} else {
reject();
}
}, reject);
});
return new Promise(function(fulfill, reject) {
fetch(request).then(function(response){
if(response.status !== 404) {
fulfill(response);
} else {
reject();
}
}, reject);
});
};
var addToCache = function(request){


@ -1,8 +1,10 @@
{{ define "title" }}
<title>Blog - Christine Dodrill</title>
{{ end }}
@use crate::post::Post;
@use super::{header_html, footer_html};
@(posts: Vec<Post>)
@:header_html(Some("Blog"), None)
{{ define "content" }}
<h1>Blogposts</h1>
<p>If you have a compatible reader, be sure to check out my <a href="/blog.rss">RSS Feed</a> for automatic updates. Also check out the <a href="/blog.json">JSONFeed</a>.</p>
@ -11,9 +13,9 @@
<p>
<ul>
{{ range . }}
<li>{{ .DateString }} - <a href="{{ .Link }}">{{ .Title }}</a></li>
{{ end }}
@for post in posts {
<li>@post.date.format("%Y-%m-%d") - <a href="@post.link">@post.front_matter.title</a></li>
}
</ul>
</p>
@ -89,4 +91,4 @@
</blockquote>
</p>
{{ end }}
@:footer_html()

122
templates/blogpost.rs.html Normal file

@ -0,0 +1,122 @@
@use super::{header_html, footer_html};
@use crate::post::Post;
@(post: Post, body: impl ToHtml)
@:header_html(Some(&post.front_matter.title.clone()), None)
<!-- Twitter -->
<meta name="twitter:card" content="summary" />
<meta name="twitter:site" content="@@theprincessxena" />
<meta name="twitter:title" content="@post.front_matter.title" />
<meta name="twitter:description" content="Posted on @post.date" />
<!-- Facebook -->
<meta property="og:type" content="website" />
<meta property="og:title" content="@post.front_matter.title" />
<meta property="og:site_name" content="Christine Dodrill's Blog" />
<!-- Description -->
<meta name="description" content="@post.front_matter.title - Christine Dodrill's Blog" />
<meta name="author" content="Christine Dodrill">
<link rel="canonical" href="https://christine.website/@post.link">
<script type="application/ld+json">
@{
"@@context": "http://schema.org",
"@@type": "Article",
"headline": "@post.front_matter.title",
"image": "https://christine.website/static/img/avatar.png",
"url": "https://christine.website/@post.link",
"datePublished": "@post.date",
"mainEntityOfPage": @{
"@@type": "WebPage",
"@@id": "https://christine.website/@post.link"
@},
"author": @{
"@@type": "Person",
"name": "Christine Dodrill"
@},
"publisher": @{
"@@type": "Person",
"name": "Christine Dodrill"
@}
@}
</script>
@body
<hr />
<!-- The button that should be clicked. -->
<button onclick="share_on_mastodon()">Share on Mastodon</button>
<p>This article was posted on @post.detri(). Facts and circumstances may have changed since publication. Please <a href="/contact">contact me</a> before jumping to conclusions if something seems wrong or unclear.</p>
@if post.front_matter.series.is_some() {
<p>Series: <a href="/blog/series/@post.front_matter.series.as_ref().unwrap()">@post.front_matter.series.as_ref().unwrap()</a></p>
}
@if post.front_matter.tags.is_some() {
<p>Tags: @for tag in post.front_matter.tags.as_ref().unwrap() { <code>@tag</code> }</p>
}
<script>
// The actual function. Set this as an onclick function for your "Share on Mastodon" button
function share_on_mastodon() @{
// Prefill the form with the user's previously-specified Mastodon instance, if applicable
var default_url = localStorage['mastodon_instance'];
// If there is no cached instance/domain, then insert a "https://" with no domain at the start of the prompt.
if (!default_url)
default_url = "https://";
var instance = prompt("Enter your instance's address: (ex: https://linuxrocks.online)", default_url);
if (instance) @{
// Handle URL formats
if ( !instance.startsWith("https://") && !instance.startsWith("http://") )
instance = "https://" + instance;
// get the current page's url
var url = window.location.href;
// get the page title from the og:title meta tag, if it exists.
var title = document.querySelectorAll('meta[property="og:title"]')[0].getAttribute("content");
// Otherwise, use the <title> tag as the title
if (!title) var title = document.getElementsByTagName("title")[0].innerHTML;
// Handle slash
if ( !instance.endsWith("/") )
instance = instance + "/";
// Cache the instance/domain for future requests
localStorage['mastodon_instance'] = instance;
// Hashtags
var hashtags = "#blogpost";
@if post.front_matter.series.is_some() {
hashtags += "#@post.front_matter.series.as_ref().unwrap()";
}
@if post.front_matter.tags.is_some() {
hashtags += "@for tag in post.front_matter.tags.as_ref().unwrap() { #@tag }";
}
// Tagging users, such as official accounts or the author of the post
var author = "@@cadey@@mst3k.interlinked.me";
// Create the Share URL
// https://someinstance.tld/share?text=URL%20encoded%20text
mastodon_url = instance + "share?text=" + encodeURIComponent(title + "\n\n" + url + "\n\n" + hashtags + " " + author);
// Open a new window at the share location
window.open(mastodon_url, '_blank');
@}
@}
</script>
@:footer_html()


@ -1,11 +1,14 @@
{{ define "title" }}<title>Contact - Christine Dodrill</title>{{ end }}
@use super::{header_html, footer_html};
@()
@:header_html(Some("Contact"), None)
{{ define "content" }}
<h1>Contact Information</h1>
<div class="grid">
<div class="cell -6of12">
<h3>Email</h3>
<p>me@christine.website</p>
<p>me@@christine.website</p>
<p>My GPG fingerprint is <code>799F 9134 8118 1111</code>. If you get an email that appears to be from me and the signature does not match that fingerprint, it is not from me. You may download a copy of my public key <a href="/static/gpg.pub">here</a>.</p>
@ -14,11 +17,10 @@
<li><a href="https://github.com/Xe">Github</a></li>
<li><a href="https://twitter.com/theprincessxena">Twitter</a></li>
<li><a href="https://keybase.io/xena">Keybase</a></li>
<li><a href="https://www.coinbase.com/christinedodrill">Coinbase</a></li>
<li><a href="https://ko-fi.com/A265JE0">Ko-fi</a></li>
<li><a href="https://www.patreon.com/cadey">Patreon</a></li>
<li><a href="https://www.facebook.com/chrissycade1337">Facebook</a></li>
<li><a href="https://mst3k.interlinked.me/@cadey">@cadey@mst3k.interlinked.me</a></li>
<li><a href="https://mst3k.interlinked.me/@@cadey">@@cadey@@mst3k.interlinked.me</a></li>
<li>Fortnite: Within Reason</li>
</ul>
</div>
@ -27,10 +29,11 @@
<p>I have a <a href="https://www.patreon.com/cadey">Patreon</a> if you want to send donations, otherwise my <a href="https://ko-fi.com/A265JE0">Ko-Fi</a> works too.</p>
<h4>Telegram</h4>
<p><a href="https://t.me/miamorecadenza">@miamorecadenza</a></p>
<p><a href="https://t.me/miamorecadenza">@@miamorecadenza</a></p>
<h4>Discord</h4>
<p><code>Cadey~#1337</code></p>
</div>
</div>
{{ end }}
@:footer_html()


@ -1,9 +0,0 @@
{{ define "title" }}
<title>Error - Christine Dodrill</title>
{{ end }}
{{ define "content" }}
<pre>
{{ . }}
</pre>
{{ end }}

13
templates/error.rs.html Normal file

@ -0,0 +1,13 @@
@use super::{header_html, footer_html};
@(why: String)
@:header_html(Some("Error"), None)
<h1>Error</h1>
<code><pre>@why</pre></code>
<p>You could try to <a href="/">go home</a> or <a href="https://github.com/Xe/site/issues/new">report this issue</a> so it can be fixed.</p>
@:footer_html()


@ -1,8 +1,9 @@
{{ define "title" }}
<title>Feeds - Christine Dodrill</title>
{{ end }}
@use super::{header_html, footer_html};
@()
@:header_html(Some("Feeds"), None)
{{ define "content" }}
<h1>Feeds</h1>
<ul>
@ -11,4 +12,4 @@
<li>Mastodon: <a href="https://mst3k.interlinked.me/users/cadey.rss">RSS</a></li>
</ul>
{{ end }}
@:footer_html()

17
templates/footer.rs.html Normal file

@ -0,0 +1,17 @@
@use crate::APPLICATION_NAME as APP;
@()
</div>
<hr />
<footer>
<blockquote>Copyright 2020 Christine Dodrill. Any and all opinions listed here are my own and not representative of my employers; future, past and present.</blockquote>
<!--<p>Like what you see? Donate on <a href="https://www.patreon.com/cadey">Patreon</a> like <a href="/patrons">these awesome people</a>!</p>-->
<p>Looking for someone for your team? Take a look <a href="/signalboost">here</a>.</p>
<p>Served by @APP running commit <a href="https://github.com/Xe/site/commit/@env!("GITHUB_SHA")">@env!("GITHUB_SHA")</a>, see <a href="https://github.com/Xe/site">source code here</a>.</p>
</footer>
</div>
<script src="/static/js/installsw.js" defer></script>
</body>
</html>


@ -1,26 +0,0 @@
{{ define "title" }}
<title>Gallery - Christine Dodrill</title>
<meta name="furbooru-validation" value="FUR-LINKVALIDATION-CD28668CBF" />
{{ end }}
{{ define "content" }}
<h1>Gallery</h1>
<p>Here are links to all of the art I have done in the last few years.</p>
<p>If you have a compatible reader, be sure to check out my <a href="/blog.rss">RSS Feed</a> for automatic updates. Also check out the <a href="/blog.json">JSONFeed</a>.</p>
<p>
<div class="grid">
{{ range . }}
<div class="card cell -4of12 blogpost-card">
<header class="card-header">{{ .Title }}</header>
<div class="card-content">
<center><p>Posted on {{ .DateString }} <br><a href="{{ .Link }}"><img src="{{ .ThumbURL }}" /></a></p></center>
</div>
</div>
{{ end }}
</div>
</p>
{{ end }}


@ -0,0 +1,23 @@
@use crate::post::Post;
@use super::{header_html, footer_html};
@(posts: Vec<Post>)
@:header_html(Some("Gallery"), None)
<h1>Gallery</h1>
<p>Here are links to a lot of the art I have done in the last few years.</p>
<div class="grid">
@for post in posts {
<div class="card cell -4of12 blogpost-card">
<header class="card-header">@post.front_matter.title</header>
<div class="card-content">
<center><p>Posted on @post.date.format("%Y-%m-%d")<br /><a href="@post.link"><img src="@post.front_matter.thumb.as_ref().unwrap()" /></a></p></center>
</div>
</div>
}
</div>
@:footer_html()


@ -1,116 +0,0 @@
{{ define "title" }}
<title>{{ .Title }} - Christine Dodrill</title>
<!-- Twitter -->
<meta name="twitter:card" content="summary" />
<meta name="twitter:site" content="@theprincessxena" />
<meta name="twitter:title" content="{{ .Title }}" />
<meta name="twitter:description" content="Posted on {{ .Date }}" />
<!-- Facebook -->
<meta property="og:type" content="website" />
<meta property="og:title" content="{{ .Title }}" />
<meta property="og:site_name" content="Talk by Christine Dodrill" />
<!-- Description -->
<meta name="description" content="{{ .Title }} - Talk by Christine Dodrill" />
<meta name="author" content="Christine Dodrill">
<link rel="canonical" href="https://christine.website/{{ .Link }}">
<script type="application/ld+json">
{
"@context": "http://schema.org",
"@type": "Painting",
"headline": "{{ .Title }}",
"image": "https://christine.website{{ .Image }}",
"url": "https://christine.website/{{ .Link }}",
"datePublished": "{{ .Date }}",
"mainEntityOfPage": {
"@type": "",
"@id": "https://christine.website{{ .Image }}"
},
"creator": {
"@type": "Person",
"name": "Christine Dodrill"
},
"publisher": {
"@type": "Person",
"name": "Christine Dodrill"
}
}
</script>
{{ end }}
{{ define "content" }}
<h1>{{ .Title }}</h1>
{{ .BodyHTML }}
<center>
<img src="{{ .Image }}" />
</center>
<hr />
<!-- The button that should be clicked. -->
<button onclick="share_on_mastodon()">Share on Mastodon</button>
<p>This artwork was posted on {{ .Date }}.</p>
{{ if ne .Tags "" }}
<p>Tags:{{.Tags}}</p>
{{ end }}
<script>
// The actual function. Set this as an onclick function for your "Share on Mastodon" button
function share_on_mastodon() {
// Prefill the form with the user's previously-specified Mastodon instance, if applicable
var default_url = localStorage['mastodon_instance'];
// If there is no cached instance/domain, then insert a "https://" with no domain at the start of the prompt.
if (!default_url)
default_url = "https://";
var instance = prompt("Enter your instance's address: (ex: https://linuxrocks.online)", default_url);
if (instance) {
// Handle URL formats
if ( !instance.startsWith("https://") && !instance.startsWith("http://") )
instance = "https://" + instance;
// Get the current page's URL
var url = window.location.href;
// Get the page title from the og:title meta tag, if it exists.
var title = document.querySelectorAll('meta[property="og:title"]')[0].getAttribute("content");
// Otherwise, use the <title> tag as the title
if (!title) var title = document.getElementsByTagName("title")[0].innerHTML;
// Handle slash
if ( !instance.endsWith("/") )
instance = instance + "/";
// Cache the instance/domain for future requests
localStorage['mastodon_instance'] = instance;
// Hashtags
var hashtags = "#art";
{{ if ne .Tags "" }}hashtags += " {{ .Tags }}";{{ end }}
// Tagging users, such as official accounts or the author of the post
var author = "@cadey@mst3k.interlinked.me";
// Create the Share URL
// https://someinstance.tld/share?text=URL%20encoded%20text
mastodon_url = instance + "share?text=" + encodeURIComponent(title + "\n\n" + url + "\n\n" + hashtags + " " + author);
// Open a new window at the share location
window.open(mastodon_url, '_blank');
}
}
</script>
{{ end }}


@ -1,69 +1,77 @@
{{ define "title" }}
<title>{{ .Title }} - Christine Dodrill</title>
@use super::{header_html, footer_html};
@use crate::post::Post;
@(post: Post, body: impl ToHtml)
@:header_html(Some(&post.front_matter.title.clone()), None)
<!-- Twitter -->
<meta name="twitter:card" content="summary" />
<meta name="twitter:site" content="@theprincessxena" />
<meta name="twitter:title" content="{{ .Title }}" />
<meta name="twitter:description" content="Posted on {{ .Date }}" />
<meta name="twitter:site" content="@@theprincessxena" />
<meta name="twitter:title" content="@post.front_matter.title" />
<meta name="twitter:description" content="Posted on @post.date" />
<!-- Facebook -->
<meta property="og:type" content="website" />
<meta property="og:title" content="{{ .Title }}" />
<meta property="og:title" content="@post.front_matter.title" />
<meta property="og:site_name" content="Christine Dodrill's Blog" />
<!-- Description -->
<meta name="description" content="{{ .Title }} - Christine Dodrill's Blog" />
<meta name="description" content="@post.front_matter.title - Christine Dodrill's Blog" />
<meta name="author" content="Christine Dodrill">
<link rel="canonical" href="https://christine.website/{{ .Link }}">
<link rel="canonical" href="https://christine.website/@post.link">
<script type="application/ld+json">
{
"@context": "http://schema.org",
"@type": "Article",
"headline": "{{ .Title }}",
@{
"@@context": "http://schema.org",
"@@type": "Article",
"headline": "@post.front_matter.title",
"image": "https://christine.website/static/img/avatar.png",
"url": "https://christine.website/{{ .Link }}",
"datePublished": "{{ .Date }}",
"mainEntityOfPage": {
"@type": "WebPage",
"@id": "https://christine.website/{{ .Link }}"
},
"author": {
"@type": "Person",
"url": "https://christine.website/@post.link",
"datePublished": "@post.date",
"mainEntityOfPage": @{
"@@type": "WebPage",
"@@id": "https://christine.website/@post.link"
@},
"author": @{
"@@type": "Person",
"name": "Christine Dodrill"
},
"publisher": {
"@type": "Person",
@},
"publisher": @{
"@@type": "Person",
"name": "Christine Dodrill"
}
}
@}
@}
</script>
{{ end }}
{{ define "content" }}
{{ .BodyHTML }}
<h1>@post.front_matter.title</h1>
@body
<center>
<img src="@post.front_matter.image.as_ref().unwrap()" />
</center>
<hr />
<!-- The button that should be clicked. -->
<button onclick="share_on_mastodon()">Share on Mastodon</button>
<p>This article was posted on {{ .Date }}. Facts and circumstances may have changed since publication. Please <a href="/contact">contact me</a> before jumping to conclusions if something seems wrong or unclear.</p>
<p>This artwork was posted on @post.detri().</p>
{{ if ne .Series "" }}
<p>Series: <a href="/blog/series/{{ .Series }}">{{ .Series }}</a></p>
{{ end }}
@if post.front_matter.series.is_some() {
<p>Series: <a href="/blog/series/@post.front_matter.series.as_ref().unwrap()">@post.front_matter.series.as_ref().unwrap()</a></p>
}
{{ if ne .Tags "" }}
<p>Tags:{{.Tags}}</p>
{{ end }}
@if post.front_matter.tags.is_some() {
<p>Tags: @for tag in post.front_matter.tags.as_ref().unwrap() { <code>@tag</code> }</p>
}
<script>
// The actual function. Set this as an onclick function for your "Share on Mastodon" button
function share_on_mastodon() {
function share_on_mastodon() @{
// Prefill the form with the user's previously-specified Mastodon instance, if applicable
var default_url = localStorage['mastodon_instance'];
@ -72,7 +80,7 @@ function share_on_mastodon() {
default_url = "https://";
var instance = prompt("Enter your instance's address: (ex: https://linuxrocks.online)", default_url);
if (instance) {
if (instance) @{
// Handle URL formats
if ( !instance.startsWith("https://") && !instance.startsWith("http://") )
instance = "https://" + instance;
@ -94,13 +102,14 @@ function share_on_mastodon() {
localStorage['mastodon_instance'] = instance;
// Hashtags
var hashtags = "#blogpost";
var hashtags = "#art";
{{ if ne .SeriesTag "" }}hashtags += " #{{ .SeriesTag }}";{{ end }}
{{ if ne .Tags "" }}hashtags += "{{ .Tags }}";{{ end }}
@if post.front_matter.tags.is_some() {
hashtags += "@for tag in post.front_matter.tags.as_ref().unwrap() { #@tag}";
}
// Tagging users, such as official accounts or the author of the post
var author = "@cadey@mst3k.interlinked.me";
var author = "@@cadey@@mst3k.interlinked.me";
// Create the Share URL
// https://someinstance.tld/share?text=URL%20encoded%20text
@ -108,8 +117,8 @@ function share_on_mastodon() {
// Open a new window at the share location
window.open(mastodon_url, '_blank');
}
}
@}
@}
</script>
{{ end }}
@:footer_html()


@ -1,17 +1,24 @@
@use chrono::{Datelike, Utc};
@(title: Option<&str>, styles: Option<&str>)
<!DOCTYPE html>
<html lang="en">
<head>
{{ template "title" . }}
@if title.is_some() {
<title>@title.unwrap() - Christine Dodrill</title>
} else {
<title>Christine Dodrill</title>
}
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta name="go-import" content="christine.website git https://github.com/Xe/site">
<link rel="stylesheet" href="/css/hack.css" />
<link rel="stylesheet" href="/css/gruvbox-dark.css" />
<!-- <link rel="stylesheet" href="/css/snow.css" /> -->
<link rel="stylesheet" href="/css/shim.css" />
@if Utc::now().month() == 12 { <link rel="stylesheet" href="/css/snow.css" /> }
<link rel="manifest" href="/static/manifest.json" />
<link rel="alternate" type="application/rss+xml" href="https://christine.website/blog.rss" />
<link rel="alternate" type="application/atom+xml" href="https://christine.website/blog.atom" />
<link rel="alternate" title="My Feed" type="application/json" href="https://christine.website/blog.json" />
<link rel="alternate" title="Christine Dodrill's Blog" type="application/rss+xml" href="https://christine.website/blog.rss" />
<link rel="alternate" title="Christine Dodrill's Blog" type="application/json" href="https://christine.website/blog.json" />
<link rel="apple-touch-icon" sizes="57x57" href="/static/favicon/apple-icon-57x57.png">
<link rel="apple-touch-icon" sizes="60x60" href="/static/favicon/apple-icon-60x60.png">
@ -30,62 +37,16 @@
<meta name="msapplication-TileColor" content="#ffffff">
<meta name="msapplication-TileImage" content="/static/favicon/ms-icon-144x144.png">
<meta name="theme-color" content="#ffffff">
<style>
.main {
padding: 20px 10px;
}
.hack h1 {
padding-top: 0;
}
footer.footer {
border-top: 1px solid #ccc;
margin-top: 80px;
margin-top: 5rem;
padding: 48px 0;
padding: 3rem 0;
}
img {
max-width: 100%;
padding: 1em;
}
</style>
{{ template "styles" . }}
@if styles.is_some() {
<style>
@styles.unwrap()
</style>
}
</head>
<body class="snow hack gruvbox-dark">
{{ template "scripts" . }}
<div class="container">
<header>
<p><a href="/">Christine Dodrill</a> - <a href="/blog">Blog</a> - <a href="/contact">Contact</a> - <a href="/gallery">Gallery</a> - <a href="/resume">Resume</a> - <a href="/talks">Talks</a> - <a href="/signalboost">Signal Boost</a> - <a href="/feeds">Feeds</a> | <a target="_blank" rel="noopener noreferrer" href="https://graphviz.christine.website">GraphViz</a> - <a target="_blank" rel="noopener noreferrer" href="https://when-then-zen.christine.website/">When Then Zen</a></p>
</header>
<div class="snowframe">
{{ template "content" . }}
</div>
<footer>
<blockquote>Copyright 2020 Christine Dodrill. Any and all opinions listed here are my own and not representative of my employers; future, past and present.</blockquote>
<br />
{{/* <p>Like what you see? Donate on <a href="https://www.patreon.com/cadey">Patreon</a> like <a href="/patrons">these awesome people</a>!</p> */}}
<p>Looking for someone for your team? Take a look <a href="/signalboost">here</a>.</p>
</footer>
<script>
if (navigator.serviceWorker.controller) {
console.log("Active service worker found, no need to register");
} else {
navigator.serviceWorker.register("/sw.js").then(function(reg) {
console.log("Service worker has been registered for scope:" + reg.scope);
});
}
</script>
</div>
<script src="/static/js/instantpage-3.0.0.js" defer type="module"> </script>
</body>
</html>
{{ define "scripts" }}{{ end }}
{{ define "styles" }}{{ end }}


@ -1,12 +1,16 @@
{{ define "title" }}
<title>Christine Dodrill</title>
@use super::{header_html, footer_html};
@()
@:header_html(None, None)
<link rel="authorization_endpoint" href="https://idp.christine.website/auth">
<link rel="canonical" href="https://christine.website/">
<meta name="google-site-verification" content="rzs9eBEquMYr9Phrg0Xm0mIwFjDBcbdgJ3jF6Disy-k" />
<script type="application/ld+json">
{
"@context": "http://schema.org/",
"@type": "Person",
@{
"@@context": "http://schema.org/",
"@@type": "Person",
"name": "Christine Dodrill",
"alternateName": "Cadey, Xe, Xena",
"url": "https://christine.website",
@ -15,16 +19,16 @@
"https://github.com/Xe",
"https://git.xeserv.us/xena",
"https://twitter.com/theprincessxena",
"https://mst3k.interlinked.me/@cadey",
"https://mst3k.interlinked.me/@@cadey",
"https://www.linkedin.com/in/christine-dodrill-1827a010b/",
"https://www.youtube.com/user/shadowh511"
]
}
@}
</script>
<!-- Twitter -->
<meta name="twitter:card" content="summary" />
<meta name="twitter:site" content="@theprincessxena" />
<meta name="twitter:site" content="@@theprincessxena" />
<meta name="twitter:title" content="Christine Dodrill" />
<meta name="twitter:description" content="Full-stack Engineer" />
@ -36,9 +40,7 @@
<!-- Description -->
<meta name="description" content="Full-stack Engineer" />
<meta name="author" content="Christine Dodrill">
{{ end }}
{{ define "content" }}
<div class="grid">
<div class="cell -3of12 content">
<img src="/static/img/avatar.png" alt="My Avatar">
@ -78,11 +80,12 @@
<ul>
<li><a href="https://github.com/Xe" rel="me">GitHub</a></li>
<li><a href="https://twitter.com/theprincessxena" rel="me">Twitter</a></li>
<li><a href="https://mst3k.interlinked.me/@cadey" rel="me">Mastodon</a></li>
<li><a href="https://mst3k.interlinked.me/@@cadey" rel="me">Mastodon</a></li>
<li><a href="https://www.patreon.com/cadey" rel="me">Patreon</a></li>
</ul>
<p>Looking for someone for your team? Check <a href="/signalboost">here</a>.
</div>
</div>
{{ end }}
@:footer_html()


@ -0,0 +1,11 @@
@use super::{header_html, footer_html};
@(path: String)
@:header_html(Some("Not Found"), None)
<h1>Not Found</h1>
<p>The path at <code>@path</code> could not be found. If you expected this path to exist, please <a href="https://github.com/Xe/site/issues/new">report this issue</a> so it can be fixed.</p>
@:footer_html()

Some files were not shown because too many files have changed in this diff.