Compare commits
105 Commits
.github/ISSUE_TEMPLATE/bug_report.md (vendored, 27 changes)

@@ -1,24 +1,33 @@
---
name: Bug report
name: 🐛 Bug report
about: Create a report to help us improve
title: Bug Report | [title]
title: ''
labels: bug
assignees: ''

---

**Describe the bug**
A clear and concise description of what the bug is.
## Describe the bug
<!--
A clear and concise description of what the bug is.
-->

**To reproduce**
## To reproduce

<!--
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
-->

**Expected behavior**
A clear and concise description of what you expected to happen.
## Expected behavior
<!--
A clear and concise description of what you expected to happen.
-->

**Additional context**
Add any other context about the problem here.
## Additional context
<!--
Add any other context about the problem here.
-->
.github/ISSUE_TEMPLATE/feature_request.md (vendored, 28 changes)

@@ -1,20 +1,28 @@
---
name: Feature request
name: 💡 Feature request
about: Suggest an idea for this project
title: Feature Request | [title]
title: ''
labels: enhancement
assignees: ''

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
## Is your feature request related to a problem? Please describe.
<!--
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
-->

**Describe the solution you'd like**
A clear and concise description of what you want to happen.
## Describe the solution you'd like
<!--
A clear and concise description of what you want to happen.
-->

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
## Describe alternatives you've considered
<!--
A clear and concise description of any alternative solutions or features you've considered.
-->

**Additional context**
Add any other context or screenshots about the feature request here.
## Additional context
<!--
Add any other context or screenshots about the feature request here.
-->
@@ -1,4 +1,4 @@
name: Docker Multi-Architecture Build
name: Docker ARM Build

on:
push:
@@ -30,7 +30,7 @@ jobs:
uses: docker/build-push-action@v2
with:
context: .
file: ./Dockerfile
platforms: linux/amd64,linux/arm64
file: ./Dockerfile.arm
platforms: linux/arm64
push: true
tags: spikecodes/libreddit:latest
tags: spikecodes/libreddit:arm
.github/workflows/rust.yml (vendored, 26 changes)

@@ -27,3 +27,29 @@ jobs:
with:
name: libreddit
path: target/release/libreddit

- name: Versions
id: version
run: |
echo "::set-output name=version::$(cargo metadata --format-version 1 --no-deps | jq .packages[0].version -r | sed 's/^/v/')"
echo "::set-output name=tag::$(git describe --tags)"

- name: Calculate SHA512 checksum
run: sha512sum target/release/libreddit > libreddit.sha512

- name: Release
uses: softprops/action-gh-release@v1
if: github.base_ref != 'master'
with:
tag_name: ${{ steps.version.outputs.version }}
name: ${{ steps.version.outputs.version }} - NAME
draft: true
files: |
target/release/libreddit
libreddit.sha512
body: |
- ${{ github.event.head_commit.message }} ${{ github.sha }}

See full list of changes [here](https://github.com/spikecodes/libreddit/compare/${{ steps.version.outputs.tag }}...${{ steps.version.outputs.version }}).
env:
GITHUB_TOKEN: ${{ secrets.RELEASE_TOKEN }}
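The `Versions` step above derives both outputs with ordinary shell. As an illustrative local check (assuming `cargo`, `jq`, and `git` are installed and the repository has at least one tag), the same pipeline can be run by hand; with the `version = "0.13.0"` bump shown in the Cargo.toml diff below, the first command prints `v0.13.0`:

```
# Reproduce the workflow's "Versions" step locally (illustrative only).
version="$(cargo metadata --format-version 1 --no-deps | jq .packages[0].version -r | sed 's/^/v/')"
tag="$(git describe --tags)"
echo "$version"   # e.g. v0.13.0, used as the new release tag
echo "$tag"       # most recent reachable tag, used as the base of the compare link
```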
Cargo.lock (generated, 1546 changes): diff suppressed because it is too large.
Cargo.toml (22 changes)

@@ -3,19 +3,23 @@ name = "libreddit"
description = " Alternative private front-end to Reddit"
license = "AGPL-3.0"
repository = "https://github.com/spikecodes/libreddit"
version = "0.3.1"
version = "0.13.0"
authors = ["spikecodes <19519553+spikecodes@users.noreply.github.com>"]
edition = "2018"

[dependencies]
tide = { version = "0.16.0", default-features = false, features = ["h1-server", "cookies"] }
async-std = { version = "1.9.0", features = ["attributes"] }
surf = { version = "2.2.0", default-features = false, features = ["curl-client", "encoding"] }
cached = "0.23.0"
askama = { version = "0.10.5", default-features = false }
serde = { version = "1.0.123", features = ["derive"] }
serde_json = "1.0.64"
async-recursion = "0.3.2"
regex = "1.4.3"
cached = "0.23.0"
clap = { version = "2.33.3", default-features = false }
time = "0.2.25"
regex = "1.5.4"
serde = { version = "1.0.126", features = ["derive"] }
cookie = "0.15.0"
futures-lite = "1.11.3"
hyper = { version = "0.14.7", features = ["full"] }
hyper-rustls = "0.22.1"
route-recognizer = "0.3.0"
serde_json = "1.0.64"
tokio = { version = "1.6.0", features = ["full"] }
time = "0.2.26"
url = "2.2.2"
Dockerfile (33 changes)

@@ -1,17 +1,36 @@
FROM rust:latest as builder
####################################################################################################
## Builder
####################################################################################################
FROM rust:alpine AS builder

RUN apk add --no-cache musl-dev

WORKDIR /libreddit

WORKDIR /usr/src/libreddit
COPY . .
RUN cargo install --path .

RUN cargo build --target x86_64-unknown-linux-musl --release

FROM debian:buster-slim
####################################################################################################
## Final image
####################################################################################################
FROM alpine:latest

RUN apt-get update && apt-get install -y libcurl4 && rm -rf /var/lib/apt/lists/*
COPY --from=builder /usr/local/cargo/bin/libreddit /usr/local/bin/libreddit
RUN useradd --system --user-group --home-dir /nonexistent --no-create-home --shell /usr/sbin/nologin libreddit
# Import ca-certificates from builder
COPY --from=builder /usr/share/ca-certificates /usr/share/ca-certificates
COPY --from=builder /etc/ssl/certs /etc/ssl/certs

# Copy our build
COPY --from=builder /libreddit/target/x86_64-unknown-linux-musl/release/libreddit /usr/local/bin/libreddit

# Use an unprivileged user.
RUN adduser --home /nonexistent --no-create-home --disabled-password libreddit
USER libreddit

# Tell Docker to expose port 8080
EXPOSE 8080

# Run a healthcheck every minute to make sure Libreddit is functional
HEALTHCHECK --interval=1m --timeout=3s CMD wget --spider --q http://localhost:8080/settings || exit 1

CMD ["libreddit"]
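For a quick local smoke test, the multi-stage image above can be built and run with the standard Docker CLI; a minimal sketch (the `libreddit` image tag is arbitrary):

```
# Build the image from the Dockerfile above and run it,
# publishing the port it EXPOSEs (8080) on the host.
docker build -t libreddit .
docker run -d --name libreddit -p 8080:8080 libreddit
```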
Dockerfile.arm (new file, 36 lines)

@@ -0,0 +1,36 @@
####################################################################################################
## Builder
####################################################################################################
FROM rust:alpine AS builder

RUN apk add --no-cache g++

WORKDIR /usr/src/libreddit

COPY . .

RUN cargo install --path .

####################################################################################################
## Final image
####################################################################################################
FROM alpine:latest

# Import ca-certificates from builder
COPY --from=builder /usr/share/ca-certificates /usr/share/ca-certificates
COPY --from=builder /etc/ssl/certs /etc/ssl/certs

# Copy our build
COPY --from=builder /usr/local/cargo/bin/libreddit /usr/local/bin/libreddit

# Use an unprivileged user.
RUN adduser --home /nonexistent --no-create-home --disabled-password libreddit
USER libreddit

# Tell Docker to expose port 8080
EXPOSE 8080

# Run a healthcheck every minute to make sure Libreddit is functional
HEALTHCHECK --interval=1m --timeout=3s CMD wget --spider --q http://localhost:8080/settings || exit 1

CMD ["libreddit"]
README.md (63 changes)

@@ -2,11 +2,11 @@

> An alternative private front-end to Reddit




---

**10 second pitch:** Libreddit is a portmanteau of "libre" (meaning freedom) and "Reddit". It is a private front-end like [Invidious](https://github.com/iv-org/invidious) but for Reddit. Browse the coldest takes of [r/unpopularopinion](https://libredd.it/r/unpopularopinion) without being [tracked](#reddit).
**10 second pitch:** Libreddit is a portmanteau of "libre" (meaning freedom) and "Reddit". It is a private front-end like [Invidious](https://github.com/iv-org/invidious) but for Reddit. Browse the coldest takes of [r/unpopularopinion](https://libreddit.spike.codes/r/unpopularopinion) without being [tracked](#reddit).

- 🚀 Fast: written in Rust for blazing fast speeds and memory safety
- ☁️ Light: no JavaScript, no ads, no tracking, no bloat
@@ -21,20 +21,6 @@

---

## Jump to...
- [About](#about)
- [Teddit Comparison](#how-does-it-compare-to-teddit)
- [Comparison](#comparison)
- [Installation](#installation)
- [Cargo](#1-cargo)
- [Docker](#2-docker)
- [AUR](#3-aur)
- [GitHub Releases](#4-github-releases)
- [Repl.it](#5-replit)
- [Deployment](#deployment)

---

# Instances

Feel free to [open an issue](https://github.com/spikecodes/libreddit/issues/new) to have your [selfhosted instance](#deployment) listed here!
@@ -43,14 +29,21 @@ Feel free to [open an issue](https://github.com/spikecodes/libreddit/issues/new)
|-|-|-|
| [libredd.it](https://libredd.it) (official) | 🇺🇸 US | |
| [libreddit.spike.codes](https://libreddit.spike.codes) (official) | 🇺🇸 US | |
| [libreddit.dothq.co](https://libreddit.dothq.co) | 🇺🇸 US | ✅ |
| [libreddit.dothq.co](https://libreddit.dothq.co) | 🇺🇸 US | |
| [libreddit.kavin.rocks](https://libreddit.kavin.rocks) | 🇮🇳 IN | ✅ |
| [libreddit.himiko.cloud](https://libreddit.himiko.cloud) | 🇧🇬 BG | |
| [libreddit.bcow.xyz](https://libreddit.bcow.xyz) | 🇺🇸 US | |
| [libreddit.40two.app](https://libreddit.40two.app) | 🇳🇱 NL | |
| [reddit.invak.id](https://reddit.invak.id) | 🇧🇬 BG | |
| [reddit.phii.me](https://reddit.phii.me) | 🇺🇸 US | |
| [lr.riverside.rocks](https://lr.riverside.rocks) | 🇺🇸 US | |
| [libreddit.silkky.cloud](https://libreddit.silkky.cloud) | 🇫🇮 FI | |
| [libreddit.database.red](https://libreddit.database.red) | 🇺🇸 US | ✅ |
| [libreddit.exonip.de](https://libreddit.exonip.de) | 🇩🇪 DE | |
| [libreddit.domain.glass](https://libreddit.domain.glass) | 🇺🇸 US | ✅ |
| [spjmllawtheisznfs7uryhxumin26ssv2draj7oope3ok3wuhy43eoyd.onion](http://spjmllawtheisznfs7uryhxumin26ssv2draj7oope3ok3wuhy43eoyd.onion) | 🇮🇳 IN | |
| [fwhhsbrbltmrct5hshrnqlqygqvcgmnek3cnka55zj4y7nuus5muwyyd.onion](http://fwhhsbrbltmrct5hshrnqlqygqvcgmnek3cnka55zj4y7nuus5muwyyd.onion) | 🇩🇪 DE | |
| [libreddit.himiko7xl2skojc6odi7hykl626gt4qki3vxdbv33u2u3af76d6k32ad.onion](http://libreddit.himiko7xl2skojc6odi7hykl626gt4qki3vxdbv33u2u3af76d6k32ad.onion) | 🇧🇬 BG | |
| [dflv6yjt7il3n3tggf4qhcmkzbti2ppytqx3o7pjrzwgntutpewscyid.onion](http://dflv6yjt7il3n3tggf4qhcmkzbti2ppytqx3o7pjrzwgntutpewscyid.onion/) | 🇺🇸 US | |
| [kphht2jcflojtqte4b4kyx7p2ahagv4debjj32nre67dxz7y57seqwyd.onion](http://kphht2jcflojtqte4b4kyx7p2ahagv4debjj32nre67dxz7y57seqwyd.onion/) | 🇳🇱 NL | |

A checkmark in the "Cloudflare" category here refers to the use of the reverse proxy, [Cloudflare](https://cloudflare). The checkmark will not be listed for a site which uses Cloudflare DNS but rather the proxying service which grants Cloudflare the ability to monitor traffic to the website.

@@ -63,9 +56,9 @@ Find Libreddit on 💬 [Matrix](https://matrix.to/#/#libreddit:kde.org), 🐋 [D
## Built with

- [Rust](https://www.rust-lang.org/) - Programming language
- [Tide](https://github.com/http-rs/tide) - Web server
- [Hyper](https://github.com/hyperium/hyper) - HTTP server and client
- [Askama](https://github.com/djc/askama) - Templating engine
- [Surf](https://github.com/http-rs/surf) - HTTP client
- [Rustls](https://github.com/ctz/rustls) - TLS library

## Info
Libreddit hopes to provide an easier way to browse Reddit, without the ads, trackers, and bloat. Libreddit was inspired by other alternative front-ends to popular services such as [Invidious](https://github.com/iv-org/invidious) for YouTube, [Nitter](https://github.com/zedeus/nitter) for Twitter, and [Bibliogram](https://sr.ht/~cadence/bibliogram/) for Instagram.
@@ -78,7 +71,7 @@ Teddit is another awesome open source project designed to provide an alternative

If you are looking to compare, the biggest differences I have noticed are:
- Libreddit is themed around Reddit's redesign whereas Teddit appears to stick much closer to Reddit's old design. This may suit some users better as design is always subjective.
- Libreddit is written in [Rust](https://www.rust-lang.org) for speed and memory safety. It uses [Actix Web](https://actix.rs), which was [benchmarked as the fastest web server for single queries](https://www.techempower.com/benchmarks/#hw=ph&test=db).
- Libreddit is written in [Rust](https://www.rust-lang.org) for speed and memory safety. It uses [Hyper](https://hyper.rs), a speedy and lightweight HTTP server/client implementation.

---

@@ -137,9 +130,9 @@ For transparency, I hope to describe all the ways Libreddit handles user privacy

**DNS:** Both official domains (`libredd.it` and `libreddit.spike.codes`) use Cloudflare as the DNS resolver. Though, the sites are not proxied through Cloudflare meaning Cloudflare doesn't have access to user traffic.

**Cookies:** Libreddit uses optional cookies to store any configured settings in [the settings menu](https://libredd.it/settings). This is not a cross-site cookie and the cookie holds no personal data, only a value of the possible layout.
**Cookies:** Libreddit uses optional cookies to store any configured settings in [the settings menu](https://libreddit.spike.codes/settings). This is not a cross-site cookie and the cookie holds no personal data, only a value of the possible layout.

**Hosting:** The official instances are hosted on [Repl.it](https://repl.it/) which monitors usage to prevent abuse. I can understand if this invalidates certain users' threat models and therefore, selfhosting and browsing through Tor are welcomed.
**Hosting:** The official instances are hosted on [Replit](https://replit.com/) which monitors usage to prevent abuse. I can understand if this invalidates certain users' threat models and therefore, selfhosting and browsing through Tor are welcomed.

---

@@ -157,14 +150,18 @@ cargo install libreddit

Deploy the [Docker image](https://hub.docker.com/r/spikecodes/libreddit) of Libreddit:
```
docker pull spikecodes/libreddit
docker run -d --name libreddit -p 8080:8080 spikecodes/libreddit
```

Deploy using a different port (in this case, port 80):
```
docker pull spikecodes/libreddit
docker run -d --name libreddit -p 80:8080 spikecodes/libreddit
```

To deploy on `arm64` platforms, simply replace `spikecodes/libreddit` in the commands above with `spikecodes/libreddit:arm`.
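Applied to the commands above, an `arm64` deployment is the same two steps with only the image tag changed; a minimal sketch:

```
docker pull spikecodes/libreddit:arm
docker run -d --name libreddit -p 8080:8080 spikecodes/libreddit:arm
```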

## 3) AUR

For ArchLinux users, Libreddit is available from the AUR as [`libreddit-git`](https://aur.archlinux.org/packages/libreddit-git).
@@ -177,15 +174,15 @@ yay -S libreddit-git

If you're on Linux and none of these methods work for you, you can grab a Linux binary from [the newest release](https://github.com/spikecodes/libreddit/releases/latest).

## 5) Repl.it
## 5) Replit

**Note:** Repl.it is a free option but they are *not* private and will monitor server usage to prevent abuse. If you need a free and easy setup, this method may work best for you.
**Note:** Replit is a free option but they are *not* private and will monitor server usage to prevent abuse. If you need a free and easy setup, this method may work best for you.

1. Create a Repl.it account (see note above)
2. Visit [the official Repl](https://repl.it/@spikethecoder/libreddit) and fork it
1. Create a Replit account (see note above)
2. Visit [the official Repl](https://replit.com/@spikethecoder/libreddit) and fork it
3. Hit the run button to download the latest Libreddit version and start it

In the web preview (defaults to top right), you should see your instance hosted where you can assign a [custom domain](https://docs.repl.it/repls/web-hosting#custom-domains).
In the web preview (defaults to top right), you should see your instance hosted where you can assign a [custom domain](https://docs.replit.com/repls/web-hosting#custom-domains).

---

@@ -197,6 +194,14 @@ Once installed, deploy Libreddit to `0.0.0.0:8080` by running:
libreddit
```

## Proxying using NGINX

**NOTE** If you're [proxying Libreddit through a NGINX Reverse Proxy](https://github.com/spikecodes/libreddit/issues/122#issuecomment-782226853), add
```nginx
proxy_http_version 1.1;
```
to your NGINX configuration file above your `proxy_pass` line.
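A minimal sketch of where that directive sits in an NGINX server block, assuming Libreddit is listening on `0.0.0.0:8080` as above (the hostname is illustrative):

```nginx
server {
    listen 80;
    server_name libreddit.example.com;  # illustrative hostname

    location / {
        # Libreddit needs HTTP/1.1 between NGINX and the backend (see note above).
        proxy_http_version 1.1;
        proxy_pass http://127.0.0.1:8080;
    }
}
```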

## Building

```
docker-compose.yml (new file, 13 lines)

@@ -0,0 +1,13 @@
version: "3.8"

services:
  web:
    build: .
    restart: always
    container_name: "libreddit"
    ports:
      - 8080:8080
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8080/settings"]
      interval: 5m
      timeout: 3s
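With this file at the repository root, the service can be built and started in the usual way; a minimal sketch:

```
# Build the image from the local Dockerfile and start Libreddit
# in the background on http://localhost:8080.
docker-compose up -d --build
```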
src/client.rs (new file, 153 lines)

@@ -0,0 +1,153 @@
|
||||
use cached::proc_macro::cached;
|
||||
use futures_lite::{future::Boxed, FutureExt};
|
||||
use hyper::{body::Buf, client, Body, Request, Response, Uri};
|
||||
use serde_json::Value;
|
||||
use std::{result::Result, str::FromStr};
|
||||
|
||||
use crate::server::RequestExt;
|
||||
|
||||
pub async fn proxy(req: Request<Body>, format: &str) -> Result<Response<Body>, String> {
|
||||
let mut url = format!("{}?{}", format, req.uri().query().unwrap_or_default());
|
||||
|
||||
for (name, value) in req.params().iter() {
|
||||
url = url.replace(&format!("{{{}}}", name), value);
|
||||
}
|
||||
|
||||
stream(&url, &req).await
|
||||
}
|
||||
|
||||
async fn stream(url: &str, req: &Request<Body>) -> Result<Response<Body>, String> {
|
||||
// First parameter is target URL (mandatory).
|
||||
let url = Uri::from_str(url).map_err(|_| "Couldn't parse URL".to_string())?;
|
||||
|
||||
// Prepare the HTTPS connector.
|
||||
let https = hyper_rustls::HttpsConnector::with_native_roots();
|
||||
|
||||
// Build the hyper client from the HTTPS connector.
|
||||
let client: client::Client<_, hyper::Body> = client::Client::builder().build(https);
|
||||
|
||||
let mut builder = Request::get(url);
|
||||
|
||||
// Copy useful headers from original request
|
||||
let headers = req.headers();
|
||||
for &key in &["Range", "If-Modified-Since", "Cache-Control"] {
|
||||
if let Some(value) = headers.get(key) {
|
||||
builder = builder.header(key, value);
|
||||
}
|
||||
}
|
||||
|
||||
let stream_request = builder.body(Body::default()).expect("stream");
|
||||
|
||||
client
|
||||
.request(stream_request)
|
||||
.await
|
||||
.map(|mut res| {
|
||||
let mut rm = |key: &str| res.headers_mut().remove(key);
|
||||
|
||||
rm("access-control-expose-headers");
|
||||
rm("server");
|
||||
rm("vary");
|
||||
rm("etag");
|
||||
rm("x-cdn");
|
||||
rm("x-cdn-client-region");
|
||||
rm("x-cdn-name");
|
||||
rm("x-cdn-server-region");
|
||||
rm("x-reddit-cdn");
|
||||
rm("x-reddit-video-features");
|
||||
|
||||
res
|
||||
})
|
||||
.map_err(|e| e.to_string())
|
||||
}
|
||||
|
||||
fn request(url: String) -> Boxed<Result<Response<Body>, String>> {
|
||||
// Prepare the HTTPS connector.
|
||||
let https = hyper_rustls::HttpsConnector::with_native_roots();
|
||||
|
||||
// Build the hyper client from the HTTPS connector.
|
||||
let client: client::Client<_, hyper::Body> = client::Client::builder().build(https);
|
||||
|
||||
let builder = Request::builder()
|
||||
.method("GET")
|
||||
.uri(&url)
|
||||
.header("User-Agent", format!("web:libreddit:{}", env!("CARGO_PKG_VERSION")))
|
||||
.header("Host", "www.reddit.com")
|
||||
.header("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8")
|
||||
.header("Accept-Language", "en-US,en;q=0.5")
|
||||
.header("Connection", "keep-alive")
|
||||
.body(Body::empty());
|
||||
|
||||
async move {
|
||||
match builder {
|
||||
Ok(req) => match client.request(req).await {
|
||||
Ok(response) => {
|
||||
if response.status().to_string().starts_with('3') {
|
||||
request(
|
||||
response
|
||||
.headers()
|
||||
.get("Location")
|
||||
.map(|val| val.to_str().unwrap_or_default())
|
||||
.unwrap_or_default()
|
||||
.to_string(),
|
||||
)
|
||||
.await
|
||||
} else {
|
||||
Ok(response)
|
||||
}
|
||||
}
|
||||
Err(e) => Err(e.to_string()),
|
||||
},
|
||||
Err(_) => Err("Post url contains non-ASCII characters".to_string()),
|
||||
}
|
||||
}
|
||||
.boxed()
|
||||
}
|
||||
|
||||
// Make a request to a Reddit API and parse the JSON response
|
||||
#[cached(size = 100, time = 30, result = true)]
|
||||
pub async fn json(path: String) -> Result<Value, String> {
|
||||
// Build Reddit url from path
|
||||
let url = format!("https://www.reddit.com{}", path);
|
||||
|
||||
// Closure to quickly build errors
|
||||
let err = |msg: &str, e: String| -> Result<Value, String> {
|
||||
// eprintln!("{} - {}: {}", url, msg, e);
|
||||
Err(format!("{}: {}", msg, e))
|
||||
};
|
||||
|
||||
// Fetch the url...
|
||||
match request(url.clone()).await {
|
||||
Ok(response) => {
|
||||
// asynchronously aggregate the chunks of the body
|
||||
match hyper::body::aggregate(response).await {
|
||||
Ok(body) => {
|
||||
// Parse the response from Reddit as JSON
|
||||
match serde_json::from_reader(body.reader()) {
|
||||
Ok(value) => {
|
||||
let json: Value = value;
|
||||
// If Reddit returned an error
|
||||
if json["error"].is_i64() {
|
||||
Err(
|
||||
json["reason"]
|
||||
.as_str()
|
||||
.unwrap_or_else(|| {
|
||||
json["message"].as_str().unwrap_or_else(|| {
|
||||
eprintln!("{} - Error parsing reddit error", url);
|
||||
"Error parsing reddit error"
|
||||
})
|
||||
})
|
||||
.to_string(),
|
||||
)
|
||||
} else {
|
||||
Ok(json)
|
||||
}
|
||||
}
|
||||
Err(e) => err("Failed to parse page JSON data", e.to_string()),
|
||||
}
|
||||
}
|
||||
Err(e) => err("Failed receiving body from Reddit", e.to_string()),
|
||||
}
|
||||
}
|
||||
Err(e) => err("Couldn't send request to Reddit", e),
|
||||
}
|
||||
}
|
src/main.rs (312 changes)

@@ -1,6 +1,17 @@
|
||||
// Global specifiers
|
||||
#![forbid(unsafe_code)]
|
||||
#![warn(clippy::pedantic, clippy::all)]
|
||||
#![allow(
|
||||
clippy::needless_pass_by_value,
|
||||
clippy::match_wildcard_for_single_variants,
|
||||
clippy::cast_possible_truncation,
|
||||
clippy::similar_names,
|
||||
clippy::cast_possible_wrap,
|
||||
clippy::find_map
|
||||
)]
|
||||
|
||||
// Reference local files
|
||||
mod post;
|
||||
mod proxy;
|
||||
mod search;
|
||||
mod settings;
|
||||
mod subreddit;
|
||||
@ -8,99 +19,72 @@ mod user;
|
||||
mod utils;
|
||||
|
||||
// Import Crates
|
||||
use clap::{App, Arg};
|
||||
use proxy::handler;
|
||||
use tide::{
|
||||
utils::{async_trait, After},
|
||||
Middleware, Next, Request, Response,
|
||||
};
|
||||
use clap::{App as cli, Arg};
|
||||
|
||||
use futures_lite::FutureExt;
|
||||
use hyper::{header::HeaderValue, Body, Request, Response};
|
||||
|
||||
mod client;
|
||||
use client::proxy;
|
||||
use server::RequestExt;
|
||||
use utils::{error, redirect};
|
||||
|
||||
// Build middleware
|
||||
struct HttpsRedirect<HttpsOnly>(HttpsOnly);
|
||||
struct NormalizePath;
|
||||
|
||||
#[async_trait]
|
||||
impl<State, HttpsOnly> Middleware<State> for HttpsRedirect<HttpsOnly>
|
||||
where
|
||||
State: Clone + Send + Sync + 'static,
|
||||
HttpsOnly: Into<bool> + Copy + Send + Sync + 'static,
|
||||
{
|
||||
async fn handle(&self, request: Request<State>, next: Next<'_, State>) -> tide::Result {
|
||||
let secure = request.url().scheme() == "https";
|
||||
|
||||
if self.0.into() && !secure {
|
||||
let mut secured = request.url().to_owned();
|
||||
secured.set_scheme("https").unwrap_or_default();
|
||||
|
||||
Ok(redirect(secured.to_string()))
|
||||
} else {
|
||||
Ok(next.run(request).await)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl<State: Clone + Send + Sync + 'static> Middleware<State> for NormalizePath {
|
||||
async fn handle(&self, request: Request<State>, next: Next<'_, State>) -> tide::Result {
|
||||
let path = request.url().path();
|
||||
let query = request.url().query().unwrap_or_default();
|
||||
if path.ends_with('/') {
|
||||
Ok(next.run(request).await)
|
||||
} else {
|
||||
let normalized = if query != "" {
|
||||
format!("{}/?{}", path.replace("//", "/"), query)
|
||||
} else {
|
||||
format!("{}/", path.replace("//", "/"))
|
||||
};
|
||||
Ok(redirect(normalized))
|
||||
}
|
||||
}
|
||||
}
|
||||
mod server;
|
||||
|
||||
// Create Services
|
||||
|
||||
// Required for the manifest to be valid
|
||||
async fn pwa_logo(_req: Request<()>) -> tide::Result {
|
||||
Ok(Response::builder(200).content_type("image/png").body(include_bytes!("../static/logo.png").as_ref()).build())
|
||||
async fn pwa_logo() -> Result<Response<Body>, String> {
|
||||
Ok(
|
||||
Response::builder()
|
||||
.status(200)
|
||||
.header("content-type", "image/png")
|
||||
.body(include_bytes!("../static/logo.png").as_ref().into())
|
||||
.unwrap_or_default(),
|
||||
)
|
||||
}
|
||||
|
||||
// Required for iOS App Icons
|
||||
async fn iphone_logo(_req: Request<()>) -> tide::Result {
|
||||
async fn iphone_logo() -> Result<Response<Body>, String> {
|
||||
Ok(
|
||||
Response::builder(200)
|
||||
.content_type("image/png")
|
||||
.body(include_bytes!("../static/apple-touch-icon.png").as_ref())
|
||||
.build(),
|
||||
Response::builder()
|
||||
.status(200)
|
||||
.header("content-type", "image/png")
|
||||
.body(include_bytes!("../static/apple-touch-icon.png").as_ref().into())
|
||||
.unwrap_or_default(),
|
||||
)
|
||||
}
|
||||
|
||||
async fn favicon(_req: Request<()>) -> tide::Result {
|
||||
async fn favicon() -> Result<Response<Body>, String> {
|
||||
Ok(
|
||||
Response::builder(200)
|
||||
.content_type("image/vnd.microsoft.icon")
|
||||
Response::builder()
|
||||
.status(200)
|
||||
.header("content-type", "image/vnd.microsoft.icon")
|
||||
.header("Cache-Control", "public, max-age=1209600, s-maxage=86400")
|
||||
.body(include_bytes!("../static/favicon.ico").as_ref())
|
||||
.build(),
|
||||
.body(include_bytes!("../static/favicon.ico").as_ref().into())
|
||||
.unwrap_or_default(),
|
||||
)
|
||||
}
|
||||
|
||||
async fn resource(body: &str, content_type: &str, cache: bool) -> tide::Result {
|
||||
let mut res = Response::new(200);
|
||||
async fn resource(body: &str, content_type: &str, cache: bool) -> Result<Response<Body>, String> {
|
||||
let mut res = Response::builder()
|
||||
.status(200)
|
||||
.header("content-type", content_type)
|
||||
.body(body.to_string().into())
|
||||
.unwrap_or_default();
|
||||
|
||||
if cache {
|
||||
res.insert_header("Cache-Control", "public, max-age=1209600, s-maxage=86400");
|
||||
if let Ok(val) = HeaderValue::from_str("public, max-age=1209600, s-maxage=86400") {
|
||||
res.headers_mut().insert("Cache-Control", val);
|
||||
}
|
||||
}
|
||||
|
||||
res.set_content_type(content_type);
|
||||
res.set_body(body);
|
||||
|
||||
Ok(res)
|
||||
}
|
||||
|
||||
#[async_std::main]
|
||||
async fn main() -> tide::Result<()> {
|
||||
let matches = App::new("Libreddit")
|
||||
#[tokio::main]
|
||||
async fn main() {
|
||||
let matches = cli::new("Libreddit")
|
||||
.version(env!("CARGO_PKG_VERSION"))
|
||||
.about("Private front-end for Reddit written in Rust ")
|
||||
.arg(
|
||||
@ -125,139 +109,157 @@ async fn main() -> tide::Result<()> {
|
||||
Arg::with_name("redirect-https")
|
||||
.short("r")
|
||||
.long("redirect-https")
|
||||
.help("Redirect all HTTP requests to HTTPS")
|
||||
.help("Redirect all HTTP requests to HTTPS (no longer functional)")
|
||||
.takes_value(false),
|
||||
)
|
||||
.arg(
|
||||
Arg::with_name("hsts")
|
||||
.short("H")
|
||||
.long("hsts")
|
||||
.value_name("EXPIRE_TIME")
|
||||
.help("HSTS header to tell browsers that this site should only be accessed over HTTPS")
|
||||
.default_value("604800")
|
||||
.takes_value(true),
|
||||
)
|
||||
.get_matches();
|
||||
|
||||
let address = matches.value_of("address").unwrap_or("0.0.0.0");
|
||||
let port = matches.value_of("port").unwrap_or("8080");
|
||||
let force_https = matches.is_present("redirect-https");
|
||||
let hsts = matches.value_of("hsts");
|
||||
|
||||
let listener = format!("{}:{}", address, port);
|
||||
|
||||
println!("Starting Libreddit...");
|
||||
|
||||
// Start HTTP server
|
||||
let mut app = tide::new();
|
||||
// Begin constructing a server
|
||||
let mut app = server::Server::new();
|
||||
|
||||
// Redirect to HTTPS if "--redirect-https" enabled
|
||||
app.with(HttpsRedirect(force_https));
|
||||
// Define default headers (added to all responses)
|
||||
app.default_headers = headers! {
|
||||
"Referrer-Policy" => "no-referrer",
|
||||
"X-Content-Type-Options" => "nosniff",
|
||||
"X-Frame-Options" => "DENY",
|
||||
"Content-Security-Policy" => "default-src 'none'; script-src 'self' blob:; manifest-src 'self'; media-src 'self' data: blob: about:; style-src 'self' 'unsafe-inline'; base-uri 'none'; img-src 'self' data:; form-action 'self'; frame-ancestors 'none'; connect-src 'self'; worker-src blob:;"
|
||||
};
|
||||
|
||||
// Append trailing slash and remove double slashes
|
||||
app.with(NormalizePath);
|
||||
|
||||
// Apply default headers for security
|
||||
app.with(After(|mut res: Response| async move {
|
||||
res.insert_header("Referrer-Policy", "no-referrer");
|
||||
res.insert_header("X-Content-Type-Options", "nosniff");
|
||||
res.insert_header("X-Frame-Options", "DENY");
|
||||
res.insert_header(
|
||||
"Content-Security-Policy",
|
||||
"default-src 'none'; manifest-src 'self'; media-src 'self'; style-src 'self' 'unsafe-inline'; base-uri 'none'; img-src 'self' data:; form-action 'self'; frame-ancestors 'none';",
|
||||
);
|
||||
Ok(res)
|
||||
}));
|
||||
if let Some(expire_time) = hsts {
|
||||
if let Ok(val) = HeaderValue::from_str(&format!("max-age={}", expire_time)) {
|
||||
app.default_headers.insert("Strict-Transport-Security", val);
|
||||
}
|
||||
}
|
||||
|
||||
// Read static files
|
||||
app.at("/style.css/").get(|_| resource(include_str!("../static/style.css"), "text/css", false));
|
||||
app.at("/style.css").get(|_| resource(include_str!("../static/style.css"), "text/css", false).boxed());
|
||||
app
|
||||
.at("/manifest.json/")
|
||||
.get(|_| resource(include_str!("../static/manifest.json"), "application/json", false));
|
||||
app.at("/robots.txt/").get(|_| resource("User-agent: *\nAllow: /", "text/plain", true));
|
||||
app.at("/favicon.ico/").get(favicon);
|
||||
app.at("/logo.png/").get(pwa_logo);
|
||||
app.at("/touch-icon-iphone.png/").get(iphone_logo);
|
||||
app.at("/apple-touch-icon.png/").get(iphone_logo);
|
||||
.at("/manifest.json")
|
||||
.get(|_| resource(include_str!("../static/manifest.json"), "application/json", false).boxed());
|
||||
app.at("/robots.txt").get(|_| resource("User-agent: *\nAllow: /", "text/plain", true).boxed());
|
||||
app.at("/favicon.ico").get(|_| favicon().boxed());
|
||||
app.at("/logo.png").get(|_| pwa_logo().boxed());
|
||||
app.at("/touch-icon-iphone.png").get(|_| iphone_logo().boxed());
|
||||
app.at("/apple-touch-icon.png").get(|_| iphone_logo().boxed());
|
||||
app
|
||||
.at("/playHLSVideo.js")
|
||||
.get(|_| resource(include_str!("../static/playHLSVideo.js"), "text/javascript", false).boxed());
|
||||
app
|
||||
.at("/hls.min.js")
|
||||
.get(|_| resource(include_str!("../static/hls.min.js"), "text/javascript", false).boxed());
|
||||
|
||||
// Proxy media through Libreddit
|
||||
app
|
||||
.at("/vid/:id/:size/") /* */
|
||||
.get(|req| handler(req, "https://v.redd.it/{}/DASH_{}", vec!["id", "size"]));
|
||||
app
|
||||
.at("/img/:id/") /* */
|
||||
.get(|req| handler(req, "https://i.redd.it/{}", vec!["id"]));
|
||||
app
|
||||
.at("/thumb/:point/:id/") /* */
|
||||
.get(|req| handler(req, "https://{}.thumbs.redditmedia.com/{}", vec!["point", "id"]));
|
||||
app
|
||||
.at("/emoji/:id/:name/") /* */
|
||||
.get(|req| handler(req, "https://emoji.redditmedia.com/{}/{}", vec!["id", "name"]));
|
||||
app
|
||||
.at("/preview/:loc/:id/:query/")
|
||||
.get(|req| handler(req, "https://{}view.redd.it/{}?{}", vec!["loc", "id", "query"]));
|
||||
app
|
||||
.at("/style/*path/") /* */
|
||||
.get(|req| handler(req, "https://styles.redditmedia.com/{}", vec!["path"]));
|
||||
app
|
||||
.at("/static/*path/") /* */
|
||||
.get(|req| handler(req, "https://www.redditstatic.com/{}", vec!["path"]));
|
||||
app.at("/vid/:id/:size").get(|r| proxy(r, "https://v.redd.it/{id}/DASH_{size}").boxed());
|
||||
app.at("/hls/:id/*path").get(|r| proxy(r, "https://v.redd.it/{id}/{path}").boxed());
|
||||
app.at("/img/:id").get(|r| proxy(r, "https://i.redd.it/{id}").boxed());
|
||||
app.at("/thumb/:point/:id").get(|r| proxy(r, "https://{point}.thumbs.redditmedia.com/{id}").boxed());
|
||||
app.at("/emoji/:id/:name").get(|r| proxy(r, "https://emoji.redditmedia.com/{id}/{name}").boxed());
|
||||
app.at("/preview/:loc/:id").get(|r| proxy(r, "https://{loc}view.redd.it/{id}").boxed());
|
||||
app.at("/style/*path").get(|r| proxy(r, "https://styles.redditmedia.com/{path}").boxed());
|
||||
app.at("/static/*path").get(|r| proxy(r, "https://www.redditstatic.com/{path}").boxed());
|
||||
|
||||
// Browse user profile
|
||||
app.at("/u/:name/").get(user::profile);
|
||||
app.at("/u/:name/comments/:id/:title/").get(post::item);
|
||||
app.at("/u/:name/comments/:id/:title/:comment_id/").get(post::item);
|
||||
app
|
||||
.at("/u/:name")
|
||||
.get(|r| async move { Ok(redirect(format!("/user/{}", r.param("name").unwrap_or_default()))) }.boxed());
|
||||
app.at("/u/:name/comments/:id/:title").get(|r| post::item(r).boxed());
|
||||
app.at("/u/:name/comments/:id/:title/:comment_id").get(|r| post::item(r).boxed());
|
||||
|
||||
app.at("/user/:name/").get(user::profile);
|
||||
app.at("/user/:name/comments/:id/").get(post::item);
|
||||
app.at("/user/:name/comments/:id/:title/").get(post::item);
|
||||
app.at("/user/:name/comments/:id/:title/:comment_id/").get(post::item);
|
||||
app.at("/user/[deleted]").get(|req| error(req, "User has deleted their account".to_string()).boxed());
|
||||
app.at("/user/:name").get(|r| user::profile(r).boxed());
|
||||
app.at("/user/:name/comments/:id").get(|r| post::item(r).boxed());
|
||||
app.at("/user/:name/comments/:id/:title").get(|r| post::item(r).boxed());
|
||||
app.at("/user/:name/comments/:id/:title/:comment_id").get(|r| post::item(r).boxed());
|
||||
|
||||
// Configure settings
|
||||
app.at("/settings/").get(settings::get).post(settings::set);
|
||||
app.at("/settings/restore/").get(settings::restore);
|
||||
app.at("/settings").get(|r| settings::get(r).boxed()).post(|r| settings::set(r).boxed());
|
||||
app.at("/settings/restore").get(|r| settings::restore(r).boxed());
|
||||
app.at("/settings/update").get(|r| settings::update(r).boxed());
|
||||
|
||||
// Subreddit services
|
||||
app.at("/r/:sub/").get(subreddit::page);
|
||||
app.at("/r/:sub").get(|r| subreddit::community(r).boxed());
|
||||
|
||||
app.at("/r/:sub/subscribe/").post(subreddit::subscriptions);
|
||||
app.at("/r/:sub/unsubscribe/").post(subreddit::subscriptions);
|
||||
app
|
||||
.at("/r/u_:name")
|
||||
.get(|r| async move { Ok(redirect(format!("/user/{}", r.param("name").unwrap_or_default()))) }.boxed());
|
||||
|
||||
app.at("/r/:sub/comments/:id/").get(post::item);
|
||||
app.at("/r/:sub/comments/:id/:title/").get(post::item);
|
||||
app.at("/r/:sub/comments/:id/:title/:comment_id/").get(post::item);
|
||||
app.at("/r/:sub/subscribe").post(|r| subreddit::subscriptions(r).boxed());
|
||||
app.at("/r/:sub/unsubscribe").post(|r| subreddit::subscriptions(r).boxed());
|
||||
|
||||
app.at("/r/:sub/search/").get(search::find);
|
||||
app.at("/r/:sub/comments/:id").get(|r| post::item(r).boxed());
|
||||
app.at("/r/:sub/comments/:id/:title").get(|r| post::item(r).boxed());
|
||||
app.at("/r/:sub/comments/:id/:title/:comment_id").get(|r| post::item(r).boxed());
|
||||
|
||||
app.at("/r/:sub/wiki/").get(subreddit::wiki);
|
||||
app.at("/r/:sub/wiki/:page/").get(subreddit::wiki);
|
||||
app.at("/r/:sub/w/").get(subreddit::wiki);
|
||||
app.at("/r/:sub/w/:page/").get(subreddit::wiki);
|
||||
app.at("/r/:sub/search").get(|r| search::find(r).boxed());
|
||||
|
||||
app.at("/r/:sub/:sort/").get(subreddit::page);
|
||||
app
|
||||
.at("/r/:sub/w")
|
||||
.get(|r| async move { Ok(redirect(format!("/r/{}/wiki", r.param("sub").unwrap_or_default()))) }.boxed());
|
||||
app
|
||||
.at("/r/:sub/w/*page")
|
||||
.get(|r| async move { Ok(redirect(format!("/r/{}/wiki/{}", r.param("sub").unwrap_or_default(), r.param("wiki").unwrap_or_default()))) }.boxed());
|
||||
app.at("/r/:sub/wiki").get(|r| subreddit::wiki(r).boxed());
|
||||
app.at("/r/:sub/wiki/*page").get(|r| subreddit::wiki(r).boxed());
|
||||
|
||||
app.at("/r/:sub/about/sidebar").get(|r| subreddit::sidebar(r).boxed());
|
||||
|
||||
app.at("/r/:sub/:sort").get(|r| subreddit::community(r).boxed());
|
||||
|
||||
// Comments handler
|
||||
app.at("/comments/:id").get(|r| post::item(r).boxed());
|
||||
|
||||
// Front page
|
||||
app.at("/").get(subreddit::page);
|
||||
app.at("/").get(|r| subreddit::community(r).boxed());
|
||||
|
||||
// View Reddit wiki
|
||||
app.at("/w/").get(subreddit::wiki);
|
||||
app.at("/w/:page/").get(subreddit::wiki);
|
||||
app.at("/wiki/").get(subreddit::wiki);
|
||||
app.at("/wiki/:page/").get(subreddit::wiki);
|
||||
app.at("/w").get(|_| async { Ok(redirect("/wiki".to_string())) }.boxed());
|
||||
app
|
||||
.at("/w/*page")
|
||||
.get(|r| async move { Ok(redirect(format!("/wiki/{}", r.param("page").unwrap_or_default()))) }.boxed());
|
||||
app.at("/wiki").get(|r| subreddit::wiki(r).boxed());
|
||||
app.at("/wiki/*page").get(|r| subreddit::wiki(r).boxed());
|
||||
|
||||
// Search all of Reddit
|
||||
app.at("/search/").get(search::find);
|
||||
app.at("/search").get(|r| search::find(r).boxed());
|
||||
|
||||
// Handle about pages
|
||||
app.at("/about/").get(|req| error(req, "About pages aren't here yet".to_string()));
|
||||
app.at("/about").get(|req| error(req, "About pages aren't added yet".to_string()).boxed());
|
||||
|
||||
app.at("/:id/").get(|req: Request<()>| async {
|
||||
match req.param("id") {
|
||||
app.at("/:id").get(|req: Request<Body>| match req.param("id").as_deref() {
|
||||
// Sort front page
|
||||
Ok("best") | Ok("hot") | Ok("new") | Ok("top") | Ok("rising") | Ok("controversial") => subreddit::page(req).await,
|
||||
Some("best") | Some("hot") | Some("new") | Some("top") | Some("rising") | Some("controversial") => subreddit::community(req).boxed(),
|
||||
// Short link for post
|
||||
Ok(id) if id.len() > 4 && id.len() < 7 => post::item(req).await,
|
||||
Some(id) if id.len() > 4 && id.len() < 7 => post::item(req).boxed(),
|
||||
// Error message for unknown pages
|
||||
_ => error(req, "Nothing here".to_string()).await,
|
||||
}
|
||||
_ => error(req, "Nothing here".to_string()).boxed(),
|
||||
});
|
||||
|
||||
// Default service in case no routes match
|
||||
app.at("*").get(|req| error(req, "Nothing here".to_string()));
|
||||
app.at("/*").get(|req| error(req, "Nothing here".to_string()).boxed());
|
||||
|
||||
println!("Running Libreddit v{} on {}!", env!("CARGO_PKG_VERSION"), listener);
|
||||
|
||||
app.listen(&listener).await?;
|
||||
let server = app.listen(listener);
|
||||
|
||||
Ok(())
|
||||
// Run this server for... forever!
|
||||
if let Err(e) = server.await {
|
||||
eprintln!("Server error: {}", e);
|
||||
}
|
||||
}
|
||||
|
src/post.rs (36 changes)

@@ -1,6 +1,9 @@
|
||||
// CRATES
|
||||
use crate::utils::*;
|
||||
use tide::Request;
|
||||
use crate::client::json;
|
||||
use crate::esc;
|
||||
use crate::server::RequestExt;
|
||||
use crate::utils::{error, format_num, format_url, param, rewrite_urls, setting, template, time, val, Author, Comment, Flags, Flair, FlairPart, Media, Post, Preferences};
|
||||
use hyper::{Body, Request, Response};
|
||||
|
||||
use async_recursion::async_recursion;
|
||||
|
||||
@ -17,36 +20,36 @@ struct PostTemplate {
|
||||
single_thread: bool,
|
||||
}
|
||||
|
||||
pub async fn item(req: Request<()>) -> tide::Result {
|
||||
pub async fn item(req: Request<Body>) -> Result<Response<Body>, String> {
|
||||
// Build Reddit API path
|
||||
let mut path: String = format!("{}.json?{}&raw_json=1", req.url().path(), req.url().query().unwrap_or_default());
|
||||
let mut path: String = format!("{}.json?{}&raw_json=1", req.uri().path(), req.uri().query().unwrap_or_default());
|
||||
|
||||
// Set sort to sort query parameter
|
||||
let mut sort: String = param(&path, "sort");
|
||||
|
||||
// Grab default comment sort method from Cookies
|
||||
let default_sort = cookie(&req, "comment_sort");
|
||||
let default_sort = setting(&req, "comment_sort");
|
||||
|
||||
// If there's no sort query but there's a default sort, set sort to default_sort
|
||||
if sort.is_empty() && !default_sort.is_empty() {
|
||||
sort = default_sort;
|
||||
path = format!("{}.json?{}&sort={}&raw_json=1", req.url().path(), req.url().query().unwrap_or_default(), sort);
|
||||
path = format!("{}.json?{}&sort={}&raw_json=1", req.uri().path(), req.uri().query().unwrap_or_default(), sort);
|
||||
}
|
||||
|
||||
// Log the post ID being fetched in debug mode
|
||||
#[cfg(debug_assertions)]
|
||||
dbg!(req.param("id").unwrap_or(""));
|
||||
dbg!(req.param("id").unwrap_or_default());
|
||||
|
||||
let single_thread = &req.param("comment_id").is_ok();
|
||||
let single_thread = req.param("comment_id").is_some();
|
||||
let highlighted_comment = &req.param("comment_id").unwrap_or_default();
|
||||
|
||||
// Send a request to the url, receive JSON in response
|
||||
match request(path).await {
|
||||
match json(path).await {
|
||||
// Otherwise, grab the JSON output from the request
|
||||
Ok(res) => {
|
||||
// Parse the JSON into Post and Comment structs
|
||||
let post = parse_post(&res[0]).await;
|
||||
let comments = parse_comments(&res[1], &post.permalink, &post.author.name, *highlighted_comment).await;
|
||||
let comments = parse_comments(&res[1], &post.permalink, &post.author.name, highlighted_comment).await;
|
||||
|
||||
// Use the Post and Comment structs to generate a website to show users
|
||||
template(PostTemplate {
|
||||
@ -54,7 +57,7 @@ pub async fn item(req: Request<()>) -> tide::Result {
|
||||
post,
|
||||
sort,
|
||||
prefs: Preferences::new(req),
|
||||
single_thread: *single_thread,
|
||||
single_thread,
|
||||
})
|
||||
}
|
||||
// If the Reddit API returns an error, exit and send error page to user
|
||||
@ -79,7 +82,7 @@ async fn parse_post(json: &serde_json::Value) -> Post {
|
||||
// Build a post using data parsed from Reddit post API
|
||||
Post {
|
||||
id: val(post, "id"),
|
||||
title: val(post, "title"),
|
||||
title: esc!(post, "title"),
|
||||
community: val(post, "subreddit"),
|
||||
body: rewrite_urls(&val(post, "selftext_html")).replace("\\", ""),
|
||||
author: Author {
|
||||
@ -90,7 +93,7 @@ async fn parse_post(json: &serde_json::Value) -> Post {
|
||||
post["data"]["author_flair_richtext"].as_array(),
|
||||
post["data"]["author_flair_text"].as_str(),
|
||||
),
|
||||
text: val(post, "link_flair_text"),
|
||||
text: esc!(post, "link_flair_text"),
|
||||
background_color: val(post, "author_flair_background_color"),
|
||||
foreground_color: val(post, "author_flair_text_color"),
|
||||
},
|
||||
@ -103,6 +106,7 @@ async fn parse_post(json: &serde_json::Value) -> Post {
|
||||
media,
|
||||
thumbnail: Media {
|
||||
url: format_url(val(post, "thumbnail").as_str()),
|
||||
alt_url: String::new(),
|
||||
width: post["data"]["thumbnail_width"].as_i64().unwrap_or_default(),
|
||||
height: post["data"]["thumbnail_height"].as_i64().unwrap_or_default(),
|
||||
poster: "".to_string(),
|
||||
@ -113,7 +117,7 @@ async fn parse_post(json: &serde_json::Value) -> Post {
|
||||
post["data"]["link_flair_richtext"].as_array(),
|
||||
post["data"]["link_flair_text"].as_str(),
|
||||
),
|
||||
text: val(post, "link_flair_text"),
|
||||
text: esc!(post, "link_flair_text"),
|
||||
background_color: val(post, "link_flair_background_color"),
|
||||
foreground_color: if val(post, "link_flair_text_color") == "dark" {
|
||||
"black".to_string()
|
||||
@ -189,14 +193,14 @@ async fn parse_comments(json: &serde_json::Value, post_link: &str, post_author:
|
||||
data["author_flair_richtext"].as_array(),
|
||||
data["author_flair_text"].as_str(),
|
||||
),
|
||||
text: val(&comment, "link_flair_text"),
|
||||
text: esc!(&comment, "link_flair_text"),
|
||||
background_color: val(&comment, "author_flair_background_color"),
|
||||
foreground_color: val(&comment, "author_flair_text_color"),
|
||||
},
|
||||
distinguished: val(&comment, "distinguished"),
|
||||
},
|
||||
score: if data["score_hidden"].as_bool().unwrap_or_default() {
|
||||
"•".to_string()
|
||||
("\u{2022}".to_string(), "Hidden".to_string())
|
||||
} else {
|
||||
format_num(score)
|
||||
},
|
||||
|
src/proxy.rs (32 changes, file deleted)

@@ -1,32 +0,0 @@
|
||||
use surf::Body;
|
||||
use tide::{Request, Response};
|
||||
|
||||
pub async fn handler(req: Request<()>, format: &str, params: Vec<&str>) -> tide::Result {
|
||||
let mut url = format.to_string();
|
||||
|
||||
for name in params {
|
||||
let param = req.param(name).unwrap_or_default();
|
||||
url = url.replacen("{}", param, 1);
|
||||
}
|
||||
|
||||
request(url).await
|
||||
}
|
||||
|
||||
async fn request(url: String) -> tide::Result {
|
||||
match surf::get(url).await {
|
||||
Ok(res) => {
|
||||
let content_length = res.header("Content-Length").map(|v| v.to_string()).unwrap_or_default();
|
||||
let content_type = res.content_type().map(|m| m.to_string()).unwrap_or_default();
|
||||
|
||||
Ok(
|
||||
Response::builder(res.status())
|
||||
.body(Body::from_reader(res, None))
|
||||
.header("Cache-Control", "public, max-age=1209600, s-maxage=86400")
|
||||
.header("Content-Length", content_length)
|
||||
.header("Content-Type", content_type)
|
||||
.build(),
|
||||
)
|
||||
}
|
||||
Err(e) => Ok(Response::builder(503).body(e.to_string()).build()),
|
||||
}
|
||||
}
|
@@ -1,7 +1,8 @@
|
||||
// CRATES
|
||||
use crate::utils::{cookie, error, param, request, template, val, Post, Preferences};
|
||||
use crate::utils::{catch_random, error, format_num, format_url, param, setting, template, val, Post, Preferences};
|
||||
use crate::{client::json, RequestExt};
|
||||
use askama::Template;
|
||||
use tide::Request;
|
||||
use hyper::{Body, Request, Response};
|
||||
|
||||
// STRUCTS
|
||||
struct SearchParams {
|
||||
@ -17,8 +18,9 @@ struct SearchParams {
|
||||
struct Subreddit {
|
||||
name: String,
|
||||
url: String,
|
||||
icon: String,
|
||||
description: String,
|
||||
subscribers: i64,
|
||||
subscribers: (String, String),
|
||||
}
|
||||
|
||||
#[derive(Template)]
|
||||
@ -29,13 +31,18 @@ struct SearchTemplate {
|
||||
sub: String,
|
||||
params: SearchParams,
|
||||
prefs: Preferences,
|
||||
url: String,
|
||||
}
|
||||
|
||||
// SERVICES
|
||||
pub async fn find(req: Request<()>) -> tide::Result {
|
||||
let nsfw_results = if cookie(&req, "show_nsfw") == "on" { "&include_over_18=on" } else { "" };
|
||||
let path = format!("{}.json?{}{}", req.url().path(), req.url().query().unwrap_or_default(), nsfw_results);
|
||||
let sub = req.param("sub").unwrap_or("").to_string();
|
||||
pub async fn find(req: Request<Body>) -> Result<Response<Body>, String> {
|
||||
let nsfw_results = if setting(&req, "show_nsfw") == "on" { "&include_over_18=on" } else { "" };
|
||||
let path = format!("{}.json?{}{}", req.uri().path(), req.uri().query().unwrap_or_default(), nsfw_results);
|
||||
let sub = req.param("sub").unwrap_or_default();
|
||||
// Handle random subreddits
|
||||
if let Ok(random) = catch_random(&sub, "/find").await {
|
||||
return Ok(random);
|
||||
}
|
||||
let query = param(&path, "q");
|
||||
|
||||
let sort = if param(&path, "sort").is_empty() {
|
||||
@ -50,6 +57,8 @@ pub async fn find(req: Request<()>) -> tide::Result {
|
||||
Vec::new()
|
||||
};
|
||||
|
||||
let url = String::from(req.uri().path_and_query().map_or("", |val| val.as_str()));
|
||||
|
||||
match Post::fetch(&path, String::new()).await {
|
||||
Ok((posts, after)) => template(SearchTemplate {
|
||||
posts,
|
||||
@ -64,6 +73,7 @@ pub async fn find(req: Request<()>) -> tide::Result {
|
||||
restrict_sr: param(&path, "restrict_sr"),
|
||||
},
|
||||
prefs: Preferences::new(req),
|
||||
url,
|
||||
}),
|
||||
Err(msg) => error(req, msg).await,
|
||||
}
|
||||
@ -73,18 +83,29 @@ async fn search_subreddits(q: &str) -> Vec<Subreddit> {
|
||||
let subreddit_search_path = format!("/subreddits/search.json?q={}&limit=3", q.replace(' ', "+"));
|
||||
|
||||
// Send a request to the url
|
||||
match request(subreddit_search_path).await {
|
||||
match json(subreddit_search_path).await {
|
||||
// If success, receive JSON in response
|
||||
Ok(response) => {
|
||||
match response["data"]["children"].as_array() {
|
||||
// For each subreddit from subreddit list
|
||||
Some(list) => list
|
||||
.iter()
|
||||
.map(|subreddit| Subreddit {
|
||||
.map(|subreddit| {
|
||||
// Fetch subreddit icon either from the community_icon or icon_img value
|
||||
let community_icon: &str = subreddit["data"]["community_icon"].as_str().map_or("", |s| s.split('?').collect::<Vec<&str>>()[0]);
|
||||
let icon = if community_icon.is_empty() {
|
||||
val(&subreddit, "icon_img")
|
||||
} else {
|
||||
community_icon.to_string()
|
||||
};
|
||||
|
||||
Subreddit {
|
||||
name: val(subreddit, "display_name_prefixed"),
|
||||
url: val(subreddit, "url"),
|
||||
icon: format_url(&icon),
|
||||
description: val(subreddit, "public_description"),
|
||||
subscribers: subreddit["data"]["subscribers"].as_u64().unwrap_or_default() as i64,
|
||||
subscribers: format_num(subreddit["data"]["subscribers"].as_f64().unwrap_or_default() as i64),
|
||||
}
|
||||
})
|
||||
.collect::<Vec<Subreddit>>(),
|
||||
_ => Vec::new(),
|
||||
|
src/server.rs (new file, 214 lines)

@@ -0,0 +1,214 @@
|
||||
use cookie::Cookie;
|
||||
use futures_lite::{future::Boxed, Future, FutureExt};
|
||||
use hyper::{
|
||||
header::HeaderValue,
|
||||
service::{make_service_fn, service_fn},
|
||||
HeaderMap,
|
||||
};
|
||||
use hyper::{Body, Method, Request, Response, Server as HyperServer};
|
||||
use route_recognizer::{Params, Router};
|
||||
use std::{pin::Pin, result::Result};
|
||||
use time::Duration;
|
||||
|
||||
type BoxResponse = Pin<Box<dyn Future<Output = Result<Response<Body>, String>> + Send>>;
|
||||
|
||||
pub struct Route<'a> {
|
||||
router: &'a mut Router<fn(Request<Body>) -> BoxResponse>,
|
||||
path: String,
|
||||
}
|
||||
|
||||
pub struct Server {
|
||||
pub default_headers: HeaderMap,
|
||||
router: Router<fn(Request<Body>) -> BoxResponse>,
|
||||
}
|
||||
|
||||
#[macro_export]
|
||||
macro_rules! headers(
|
||||
{ $($key:expr => $value:expr),+ } => {
|
||||
{
|
||||
let mut m = hyper::HeaderMap::new();
|
||||
$(
|
||||
if let Ok(val) = hyper::header::HeaderValue::from_str($value) {
|
||||
m.insert($key, val);
|
||||
}
|
||||
)+
|
||||
m
|
||||
}
|
||||
};
|
||||
);
|
||||
|
||||
pub trait RequestExt {
|
||||
fn params(&self) -> Params;
|
||||
fn param(&self, name: &str) -> Option<String>;
|
||||
fn set_params(&mut self, params: Params) -> Option<Params>;
|
||||
fn cookies(&self) -> Vec<Cookie>;
|
||||
fn cookie(&self, name: &str) -> Option<Cookie>;
|
||||
}
|
||||
|
||||
pub trait ResponseExt {
|
||||
fn cookies(&self) -> Vec<Cookie>;
|
||||
fn insert_cookie(&mut self, cookie: Cookie);
|
||||
fn remove_cookie(&mut self, name: String);
|
||||
}
|
||||
|
||||
impl RequestExt for Request<Body> {
|
||||
fn params(&self) -> Params {
|
||||
self.extensions().get::<Params>().unwrap_or(&Params::new()).to_owned()
|
||||
// self.extensions()
|
||||
// .get::<RequestMeta>()
|
||||
// .and_then(|meta| meta.route_params())
|
||||
// .expect("Routerify: No RouteParams added while processing request")
|
||||
}
|
||||
|
||||
fn param(&self, name: &str) -> Option<String> {
|
||||
self.params().find(name).map(std::borrow::ToOwned::to_owned)
|
||||
}
|
||||
|
||||
fn set_params(&mut self, params: Params) -> Option<Params> {
|
||||
self.extensions_mut().insert(params)
|
||||
}
|
||||
|
||||
fn cookies(&self) -> Vec<Cookie> {
|
||||
let mut cookies = Vec::new();
|
||||
if let Some(header) = self.headers().get("Cookie") {
|
||||
for cookie in header.to_str().unwrap_or_default().split("; ") {
|
||||
cookies.push(Cookie::parse(cookie).unwrap_or_else(|_| Cookie::named("")));
|
||||
}
|
||||
}
|
||||
cookies
|
||||
}
|
||||
|
||||
fn cookie(&self, name: &str) -> Option<Cookie> {
|
||||
self.cookies().iter().find(|c| c.name() == name).map(std::borrow::ToOwned::to_owned)
|
||||
}
|
||||
}
|
||||
|
||||
impl ResponseExt for Response<Body> {
|
||||
fn cookies(&self) -> Vec<Cookie> {
|
||||
let mut cookies = Vec::new();
|
||||
for header in self.headers().get_all("Cookie") {
|
||||
if let Ok(cookie) = Cookie::parse(header.to_str().unwrap_or_default()) {
|
||||
cookies.push(cookie);
|
||||
}
|
||||
}
|
||||
cookies
|
||||
}
|
||||
|
||||
fn insert_cookie(&mut self, cookie: Cookie) {
|
||||
if let Ok(val) = HeaderValue::from_str(&cookie.to_string()) {
|
||||
self.headers_mut().append("Set-Cookie", val);
|
||||
}
|
||||
}
|
||||
|
||||
fn remove_cookie(&mut self, name: String) {
|
||||
let mut cookie = Cookie::named(name);
|
||||
cookie.set_path("/");
|
||||
cookie.set_max_age(Duration::second());
|
||||
if let Ok(val) = HeaderValue::from_str(&cookie.to_string()) {
|
||||
self.headers_mut().append("Set-Cookie", val);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl Route<'_> {
|
||||
fn method(&mut self, method: Method, dest: fn(Request<Body>) -> BoxResponse) -> &mut Self {
|
||||
self.router.add(&format!("/{}{}", method.as_str(), self.path), dest);
|
||||
self
|
||||
}
|
||||
|
||||
/// Add an endpoint for `GET` requests
|
||||
pub fn get(&mut self, dest: fn(Request<Body>) -> BoxResponse) -> &mut Self {
|
||||
self.method(Method::GET, dest)
|
||||
}
|
||||
|
||||
/// Add an endpoint for `POST` requests
|
||||
pub fn post(&mut self, dest: fn(Request<Body>) -> BoxResponse) -> &mut Self {
|
||||
self.method(Method::POST, dest)
|
||||
}
|
||||
}
|
||||
|
||||
impl Server {
|
||||
pub fn new() -> Self {
|
||||
Server {
|
||||
default_headers: HeaderMap::new(),
|
||||
router: Router::new(),
|
||||
}
|
||||
}
|
||||
|
||||
pub fn at(&mut self, path: &str) -> Route {
|
||||
Route {
|
||||
path: path.to_owned(),
|
||||
router: &mut self.router,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn listen(self, addr: String) -> Boxed<Result<(), hyper::Error>> {
|
||||
let make_svc = make_service_fn(move |_conn| {
|
||||
let router = self.router.clone();
|
||||
let default_headers = self.default_headers.clone();
|
||||
|
||||
// This is the `Service` that will handle the connection.
|
||||
// `service_fn` is a helper to convert a function that
|
||||
// returns a Response into a `Service`.
|
||||
// let shared_router = router.clone();
|
||||
async move {
|
||||
Ok::<_, String>(service_fn(move |req: Request<Body>| {
|
||||
let headers = default_headers.clone();
|
||||
|
||||
// Remove double slashes
|
||||
let mut path = req.uri().path().replace("//", "/");
|
||||
|
||||
// Remove trailing slashes
|
||||
if path.ends_with('/') && path != "/" {
|
||||
path.pop();
|
||||
}
|
||||
|
||||
// Match the visited path with an added route
|
||||
match router.recognize(&format!("/{}{}", req.method().as_str(), path)) {
|
||||
// If a route was configured for this path
|
||||
Ok(found) => {
|
||||
let mut parammed = req;
|
||||
parammed.set_params(found.params().to_owned());
|
||||
|
||||
// Run the route's function
|
||||
let func = (found.handler().to_owned().to_owned())(parammed);
|
||||
async move {
|
||||
let res: Result<Response<Body>, String> = func.await;
|
||||
// Add default headers to response
|
||||
res.map(|mut response| {
|
||||
response.headers_mut().extend(headers);
|
||||
response
|
||||
})
|
||||
}
|
||||
.boxed()
|
||||
}
|
||||
// If there was a routing error
|
||||
Err(e) => async move {
|
||||
// Return a 404 error
|
||||
let res: Result<Response<Body>, String> = Ok(Response::builder().status(404).body(e.into()).unwrap_or_default());
|
||||
// Add default headers to response
|
||||
res.map(|mut response| {
|
||||
response.headers_mut().extend(headers);
|
||||
response
|
||||
})
|
||||
}
|
||||
.boxed(),
|
||||
}
|
||||
}))
|
||||
}
|
||||
});
|
||||
|
||||
let address = &addr.parse().unwrap_or_else(|_| panic!("Cannot parse {} as address (example format: 0.0.0.0:8080)", addr));
|
||||
|
||||
let server = HyperServer::bind(address).serve(make_svc);
|
||||
|
||||
let graceful = server.with_graceful_shutdown(shutdown_signal());
|
||||
|
||||
graceful.boxed()
|
||||
}
|
||||
}
|
||||
|
||||
async fn shutdown_signal() {
|
||||
// Wait for the CTRL+C signal
|
||||
tokio::signal::ctrl_c().await.expect("Failed to install CTRL+C signal handler");
|
||||
}
|
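For orientation, a hedged sketch of how this `Server`, its `Route` builder, and the `headers!` macro are meant to be driven. The route path, handler body, and use of `#[tokio::main]` are illustrative rather than taken from the diff; the pattern of boxing an async handler matches the `fn(Request<Body>) -> BoxResponse` signature above:

use futures_lite::FutureExt;
use hyper::{Body, Request, Response};

// Assumes `Server` and the exported `headers!` macro from src/server.rs are in scope,
// e.g. `use crate::server::Server;`. Handler and path below are made up for the example.
async fn hello(_req: Request<Body>) -> Result<Response<Body>, String> {
  Ok(Response::new(Body::from("Hello from the Libreddit server wrapper")))
}

#[tokio::main]
async fn main() {
  let mut server = Server::new();

  // Default headers are merged into every response inside `listen`.
  server.default_headers = headers! {
    "Referrer-Policy" => "no-referrer"
  };

  // Register a GET route; the non-capturing closure boxes the async fn into a BoxResponse.
  server.at("/hello").get(|req| hello(req).boxed());

  // Bind and serve until CTRL+C fires `shutdown_signal`.
  let _ = server.listen("0.0.0.0:8080".to_string()).await;
}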
119 src/settings.rs
@@ -1,7 +1,12 @@
|
||||
use std::collections::HashMap;
|
||||
|
||||
// CRATES
|
||||
use crate::server::ResponseExt;
|
||||
use crate::utils::{redirect, template, Preferences};
|
||||
use askama::Template;
|
||||
use tide::{http::Cookie, Request};
|
||||
use cookie::Cookie;
|
||||
use futures_lite::StreamExt;
|
||||
use hyper::{Body, Request, Response};
|
||||
use time::{Duration, OffsetDateTime};
|
||||
|
||||
// STRUCTS
|
||||
@ -11,77 +16,119 @@ struct SettingsTemplate {
|
||||
prefs: Preferences,
|
||||
}
|
||||
|
||||
#[derive(serde::Deserialize, Default)]
|
||||
#[serde(default)]
|
||||
pub struct SettingsForm {
|
||||
theme: Option<String>,
|
||||
front_page: Option<String>,
|
||||
layout: Option<String>,
|
||||
wide: Option<String>,
|
||||
comment_sort: Option<String>,
|
||||
show_nsfw: Option<String>,
|
||||
redirect: Option<String>,
|
||||
subscriptions: Option<String>,
|
||||
}
|
||||
// CONSTANTS
|
||||
|
||||
const PREFS: [&str; 10] = [
|
||||
"theme",
|
||||
"front_page",
|
||||
"layout",
|
||||
"wide",
|
||||
"comment_sort",
|
||||
"post_sort",
|
||||
"show_nsfw",
|
||||
"use_hls",
|
||||
"hide_hls_notification",
|
||||
"subscriptions",
|
||||
];
|
||||
|
||||
// FUNCTIONS
|
||||
|
||||
// Retrieve cookies from request "Cookie" header
|
||||
pub async fn get(req: Request<()>) -> tide::Result {
|
||||
pub async fn get(req: Request<Body>) -> Result<Response<Body>, String> {
|
||||
template(SettingsTemplate { prefs: Preferences::new(req) })
|
||||
}
|
||||
|
||||
// Set cookies using response "Set-Cookie" header
|
||||
pub async fn set(mut req: Request<()>) -> tide::Result {
|
||||
let form: SettingsForm = req.body_form().await.unwrap_or_default();
|
||||
pub async fn set(req: Request<Body>) -> Result<Response<Body>, String> {
|
||||
// Split the body into parts
|
||||
let (parts, mut body) = req.into_parts();
|
||||
|
||||
// Grab existing cookies
|
||||
let mut cookies = Vec::new();
|
||||
for header in parts.headers.get_all("Cookie") {
|
||||
if let Ok(cookie) = Cookie::parse(header.to_str().unwrap_or_default()) {
|
||||
cookies.push(cookie);
|
||||
}
|
||||
}
|
||||
|
||||
// Aggregate the body...
|
||||
// let whole_body = hyper::body::aggregate(req).await.map_err(|e| e.to_string())?;
|
||||
let body_bytes = body
|
||||
.try_fold(Vec::new(), |mut data, chunk| {
|
||||
data.extend_from_slice(&chunk);
|
||||
Ok(data)
|
||||
})
|
||||
.await
|
||||
.map_err(|e| e.to_string())?;
|
||||
|
||||
let form = url::form_urlencoded::parse(&body_bytes).collect::<HashMap<_, _>>();
|
||||
|
||||
let mut res = redirect("/settings".to_string());
|
||||
|
||||
let names = vec!["theme", "front_page", "layout", "wide", "comment_sort", "show_nsfw"];
|
||||
let values = vec![form.theme, form.front_page, form.layout, form.wide, form.comment_sort, form.show_nsfw];
|
||||
|
||||
for (i, name) in names.iter().enumerate() {
|
||||
match values.get(i) {
|
||||
for &name in &PREFS {
|
||||
match form.get(name) {
|
||||
Some(value) => res.insert_cookie(
|
||||
Cookie::build(name.to_owned(), value.to_owned().unwrap_or_default())
|
||||
Cookie::build(name.to_owned(), value.to_owned())
|
||||
.path("/")
|
||||
.http_only(true)
|
||||
.expires(OffsetDateTime::now_utc() + Duration::weeks(52))
|
||||
.finish(),
|
||||
),
|
||||
None => res.remove_cookie(Cookie::named(name.to_owned())),
|
||||
None => res.remove_cookie(name.to_string()),
|
||||
};
|
||||
}
|
||||
|
||||
Ok(res)
|
||||
}
|
||||
|
||||
// Set cookies using response "Set-Cookie" header
|
||||
pub async fn restore(req: Request<()>) -> tide::Result {
|
||||
let form: SettingsForm = req.query()?;
|
||||
fn set_cookies_method(req: Request<Body>, remove_cookies: bool) -> Response<Body> {
|
||||
// Split the body into parts
|
||||
let (parts, _) = req.into_parts();
|
||||
|
||||
let path = match form.redirect {
|
||||
Some(value) => format!("/{}/", value),
|
||||
// Grab existing cookies
|
||||
let mut cookies = Vec::new();
|
||||
for header in parts.headers.get_all("Cookie") {
|
||||
if let Ok(cookie) = Cookie::parse(header.to_str().unwrap_or_default()) {
|
||||
cookies.push(cookie);
|
||||
}
|
||||
}
|
||||
|
||||
let query = parts.uri.query().unwrap_or_default().as_bytes();
|
||||
|
||||
let form = url::form_urlencoded::parse(query).collect::<HashMap<_, _>>();
|
||||
|
||||
let path = match form.get("redirect") {
|
||||
Some(value) => format!("/{}", value.replace("%26", "&").replace("%23", "#")),
|
||||
None => "/".to_string(),
|
||||
};
|
||||
|
||||
let mut res = redirect(path);
|
||||
|
||||
let names = vec!["theme", "front_page", "layout", "wide", "comment_sort", "show_nsfw", "subscriptions"];
|
||||
let values = vec![form.theme, form.front_page, form.layout, form.wide, form.comment_sort, form.show_nsfw, form.subscriptions];
|
||||
|
||||
for (i, name) in names.iter().enumerate() {
|
||||
match values.get(i) {
|
||||
for &name in &PREFS {
|
||||
match form.get(name) {
|
||||
Some(value) => res.insert_cookie(
|
||||
Cookie::build(name.to_owned(), value.to_owned().unwrap_or_default())
|
||||
Cookie::build(name.to_owned(), value.to_owned())
|
||||
.path("/")
|
||||
.http_only(true)
|
||||
.expires(OffsetDateTime::now_utc() + Duration::weeks(52))
|
||||
.finish(),
|
||||
),
|
||||
None => res.remove_cookie(Cookie::named(name.to_owned())),
|
||||
None => {
|
||||
if remove_cookies {
|
||||
res.remove_cookie(name.to_string())
|
||||
}
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
Ok(res)
|
||||
res
|
||||
}
|
||||
|
||||
// Set cookies using response "Set-Cookie" header
|
||||
pub async fn restore(req: Request<Body>) -> Result<Response<Body>, String> {
|
||||
Ok(set_cookies_method(req, true))
|
||||
}
|
||||
|
||||
pub async fn update(req: Request<Body>) -> Result<Response<Body>, String> {
|
||||
Ok(set_cookies_method(req, false))
|
||||
}
|
||||
|
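The cookie-setting path in `set` and `set_cookies_method` above boils down to: collect the body (or query string), parse it as `application/x-www-form-urlencoded` into a map, then emit one year-long `Set-Cookie` per known preference. A minimal sketch of that step under the same `url`, `cookie`, and `time` crates used in the diff; the function name and the shortened preference list are illustrative:

use std::collections::HashMap;
use cookie::Cookie;
use time::{Duration, OffsetDateTime};

// Sketch: turn a form-encoded string into year-long preference cookies.
fn cookies_from_form(raw: &str) -> Vec<Cookie<'static>> {
  let form = url::form_urlencoded::parse(raw.as_bytes())
    .into_owned()
    .collect::<HashMap<String, String>>();

  ["theme", "front_page", "layout"] // subset of the PREFS list above
    .iter()
    .filter_map(|&name| {
      form.get(name).map(|value| {
        Cookie::build(name.to_owned(), value.to_owned())
          .path("/")
          .http_only(true)
          .expires(OffsetDateTime::now_utc() + Duration::weeks(52))
          .finish()
      })
    })
    .collect()
}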
171 src/subreddit.rs
@@ -1,7 +1,10 @@
|
||||
// CRATES
|
||||
use crate::utils::*;
|
||||
use crate::esc;
|
||||
use crate::utils::{catch_random, error, format_num, format_url, param, redirect, rewrite_urls, setting, template, val, Post, Preferences, Subreddit};
|
||||
use crate::{client::json, server::ResponseExt, RequestExt};
|
||||
use askama::Template;
|
||||
use tide::{http::Cookie, Request};
|
||||
use cookie::Cookie;
|
||||
use hyper::{Body, Request, Response};
|
||||
use time::{Duration, OffsetDateTime};
|
||||
|
||||
// STRUCTS
|
||||
@ -13,6 +16,7 @@ struct SubredditTemplate {
|
||||
sort: (String, String),
|
||||
ends: (String, String),
|
||||
prefs: Preferences,
|
||||
url: String,
|
||||
}
|
||||
|
||||
#[derive(Template)]
|
||||
@ -25,13 +29,14 @@ struct WikiTemplate {
|
||||
}
|
||||
|
||||
// SERVICES
|
||||
pub async fn page(req: Request<()>) -> tide::Result {
|
||||
pub async fn community(req: Request<Body>) -> Result<Response<Body>, String> {
|
||||
// Build Reddit API path
|
||||
let subscribed = cookie(&req, "subscriptions");
|
||||
let front_page = cookie(&req, "front_page");
|
||||
let sort = req.param("sort").unwrap_or_else(|_| req.param("id").unwrap_or("hot")).to_string();
|
||||
let subscribed = setting(&req, "subscriptions");
|
||||
let front_page = setting(&req, "front_page");
|
||||
let post_sort = req.cookie("post_sort").map_or_else(|| "hot".to_string(), |c| c.value().to_string());
|
||||
let sort = req.param("sort").unwrap_or_else(|| req.param("id").unwrap_or(post_sort));
|
||||
|
||||
let sub = req.param("sub").map(String::from).unwrap_or(if front_page == "default" || front_page.is_empty() {
|
||||
let sub = req.param("sub").unwrap_or(if front_page == "default" || front_page.is_empty() {
|
||||
if subscribed.is_empty() {
|
||||
"popular".to_string()
|
||||
} else {
|
||||
@ -41,7 +46,16 @@ pub async fn page(req: Request<()>) -> tide::Result {
|
||||
front_page.to_owned()
|
||||
});
|
||||
|
||||
let path = format!("/r/{}/{}.json?{}&raw_json=1", sub, sort, req.url().query().unwrap_or_default());
|
||||
// Handle random subreddits
|
||||
if let Ok(random) = catch_random(&sub, "").await {
|
||||
return Ok(random);
|
||||
}
|
||||
|
||||
if req.param("sub").is_some() && sub.starts_with("u_") {
|
||||
return Ok(redirect(["/user/", &sub[2..]].concat()));
|
||||
}
|
||||
|
||||
let path = format!("/r/{}/{}.json?{}&raw_json=1", sub, sort, req.uri().query().unwrap_or_default());
|
||||
|
||||
match Post::fetch(&path, String::new()).await {
|
||||
Ok((posts, after)) => {
|
||||
@ -51,7 +65,7 @@ pub async fn page(req: Request<()>) -> tide::Result {
|
||||
subreddit(&sub).await.unwrap_or_default()
|
||||
} else if sub == subscribed {
|
||||
// Subscription feed
|
||||
if req.url().path().starts_with("/r/") {
|
||||
if req.uri().path().starts_with("/r/") {
|
||||
subreddit(&sub).await.unwrap_or_default()
|
||||
} else {
|
||||
Subreddit::default()
|
||||
@ -66,12 +80,15 @@ pub async fn page(req: Request<()>) -> tide::Result {
|
||||
Subreddit::default()
|
||||
};
|
||||
|
||||
let url = String::from(req.uri().path_and_query().map_or("", |val| val.as_str()));
|
||||
|
||||
template(SubredditTemplate {
|
||||
sub,
|
||||
posts,
|
||||
sort: (sort, param(&path, "t")),
|
||||
ends: (param(&path, "after"), after),
|
||||
prefs: Preferences::new(req),
|
||||
url,
|
||||
})
|
||||
}
|
||||
Err(msg) => match msg.as_str() {
|
||||
@ -84,15 +101,44 @@ pub async fn page(req: Request<()>) -> tide::Result {
|
||||
}
|
||||
|
||||
// Sub or unsub by setting subscription cookie using response "Set-Cookie" header
|
||||
pub async fn subscriptions(req: Request<()>) -> tide::Result {
|
||||
let sub = req.param("sub").unwrap_or_default().to_string();
|
||||
let query = req.url().query().unwrap_or_default().to_string();
|
||||
let action: Vec<String> = req.url().path().split('/').map(String::from).collect();
|
||||
pub async fn subscriptions(req: Request<Body>) -> Result<Response<Body>, String> {
|
||||
let sub = req.param("sub").unwrap_or_default();
|
||||
// Handle random subreddits
|
||||
if sub == "random" || sub == "randnsfw" {
|
||||
return Err("Can't subscribe to random subreddit!".to_string());
|
||||
}
|
||||
|
||||
let query = req.uri().query().unwrap_or_default().to_string();
|
||||
let action: Vec<String> = req.uri().path().split('/').map(String::from).collect();
|
||||
|
||||
let mut sub_list = Preferences::new(req).subscriptions;
|
||||
|
||||
// Retrieve list of posts for these subreddits to extract display names
|
||||
let display = json(format!("/r/{}/hot.json?raw_json=1", sub)).await?;
|
||||
let display_lookup: Vec<(String, &str)> = display["data"]["children"]
|
||||
.as_array()
|
||||
.unwrap()
|
||||
.iter()
|
||||
.map(|post| {
|
||||
let display_name = post["data"]["subreddit"].as_str().unwrap();
|
||||
(display_name.to_lowercase(), display_name)
|
||||
})
|
||||
.collect();
|
||||
|
||||
// Find each subreddit name (separated by '+') in sub parameter
|
||||
for part in sub.split('+') {
|
||||
// Retrieve display name for the subreddit
|
||||
let display;
|
||||
let part = if let Some(&(_, display)) = display_lookup.iter().find(|x| x.0 == part.to_lowercase()) {
|
||||
// This is already known, so it doesn't require a separate request
|
||||
display
|
||||
} else {
|
||||
// This subreddit display name isn't known, retrieve it
|
||||
let path: String = format!("/r/{}/about.json?raw_json=1", part);
|
||||
display = json(path).await?;
|
||||
display["data"]["display_name"].as_str().ok_or_else(|| "Failed to query subreddit name".to_string())?
|
||||
};
|
||||
|
||||
// Modify sub list based on action
|
||||
if action.contains(&"subscribe".to_string()) && !sub_list.contains(&part.to_owned()) {
|
||||
// Add each sub name to the subscribed list
|
||||
@ -108,18 +154,17 @@ pub async fn subscriptions(req: Request<()>) -> tide::Result {
|
||||
// Redirect back to subreddit
|
||||
// check for redirect parameter if unsubscribing from outside sidebar
|
||||
let redirect_path = param(&format!("/?{}", query), "redirect");
|
||||
let path = if !redirect_path.is_empty() {
|
||||
format!("/{}/", redirect_path)
|
||||
} else {
|
||||
let path = if redirect_path.is_empty() {
|
||||
format!("/r/{}", sub)
|
||||
} else {
|
||||
format!("/{}/", redirect_path)
|
||||
};
|
||||
|
||||
let mut res = redirect(path);
|
||||
|
||||
// Delete cookie if empty, else set
|
||||
if sub_list.is_empty() {
|
||||
// res.del_cookie(&Cookie::build("subscriptions", "").path("/").finish());
|
||||
res.remove_cookie(Cookie::build("subscriptions", "").path("/").finish());
|
||||
res.remove_cookie("subscriptions".to_string());
|
||||
} else {
|
||||
res.insert_cookie(
|
||||
Cookie::build("subscriptions", sub_list.join("+"))
|
||||
@ -133,15 +178,20 @@ pub async fn subscriptions(req: Request<()>) -> tide::Result {
|
||||
Ok(res)
|
||||
}
|
||||
|
||||
pub async fn wiki(req: Request<()>) -> tide::Result {
|
||||
let sub = req.param("sub").unwrap_or("reddit.com").to_string();
|
||||
let page = req.param("page").unwrap_or("index").to_string();
|
||||
pub async fn wiki(req: Request<Body>) -> Result<Response<Body>, String> {
|
||||
let sub = req.param("sub").unwrap_or_else(|| "reddit.com".to_string());
|
||||
// Handle random subreddits
|
||||
if let Ok(random) = catch_random(&sub, "/wiki").await {
|
||||
return Ok(random);
|
||||
}
|
||||
|
||||
let page = req.param("page").unwrap_or_else(|| "index".to_string());
|
||||
let path: String = format!("/r/{}/wiki/{}.json?raw_json=1", sub, page);
|
||||
|
||||
match request(path).await {
|
||||
Ok(res) => template(WikiTemplate {
|
||||
match json(path).await {
|
||||
Ok(response) => template(WikiTemplate {
|
||||
sub,
|
||||
wiki: rewrite_urls(res["data"]["content_html"].as_str().unwrap_or_default()),
|
||||
wiki: rewrite_urls(response["data"]["content_html"].as_str().unwrap_or("<h3>Wiki not found</h3>")),
|
||||
page,
|
||||
prefs: Preferences::new(req),
|
||||
}),
|
||||
@ -149,13 +199,75 @@ pub async fn wiki(req: Request<()>) -> tide::Result {
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn sidebar(req: Request<Body>) -> Result<Response<Body>, String> {
|
||||
let sub = req.param("sub").unwrap_or_else(|| "reddit.com".to_string());
|
||||
// Handle random subreddits
|
||||
if let Ok(random) = catch_random(&sub, "/about/sidebar").await {
|
||||
return Ok(random);
|
||||
}
|
||||
|
||||
// Build the Reddit JSON API url
|
||||
let path: String = format!("/r/{}/about.json?raw_json=1", sub);
|
||||
|
||||
// Send a request to the url
|
||||
match json(path).await {
|
||||
// If success, receive JSON in response
|
||||
Ok(response) => template(WikiTemplate {
|
||||
wiki: format!(
|
||||
"{}<hr><h1>Moderators</h1><br><ul>{}</ul>",
|
||||
rewrite_urls(&val(&response, "description_html").replace("\\", "")),
|
||||
moderators(&sub).await?.join(""),
|
||||
),
|
||||
sub,
|
||||
page: "Sidebar".to_string(),
|
||||
prefs: Preferences::new(req),
|
||||
}),
|
||||
Err(msg) => error(req, msg).await,
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn moderators(sub: &str) -> Result<Vec<String>, String> {
|
||||
// Retrieve and format the html for the moderators list
|
||||
Ok(
|
||||
moderators_list(sub)
|
||||
.await?
|
||||
.iter()
|
||||
.map(|m| format!("<li><a style=\"color: var(--accent)\" href=\"/u/{name}\">{name}</a></li>", name = m))
|
||||
.collect(),
|
||||
)
|
||||
}
|
||||
|
||||
async fn moderators_list(sub: &str) -> Result<Vec<String>, String> {
|
||||
// Build the moderator list URL
|
||||
let path: String = format!("/r/{}/about/moderators.json?raw_json=1", sub);
|
||||
|
||||
// Retrieve response
|
||||
let response = json(path).await?["data"]["children"].clone();
|
||||
Ok(
|
||||
// Traverse json tree and format into list of strings
|
||||
response
|
||||
.as_array()
|
||||
.unwrap_or(&Vec::new())
|
||||
.iter()
|
||||
.filter_map(|moderator| {
|
||||
let name = moderator["name"].as_str().unwrap_or_default();
|
||||
if name.is_empty() {
|
||||
None
|
||||
} else {
|
||||
Some(name.to_string())
|
||||
}
|
||||
})
|
||||
.collect::<Vec<_>>(),
|
||||
)
|
||||
}
|
||||
|
||||
// SUBREDDIT
|
||||
async fn subreddit(sub: &str) -> Result<Subreddit, String> {
|
||||
// Build the Reddit JSON API url
|
||||
let path: String = format!("/r/{}/about.json?raw_json=1", sub);
|
||||
|
||||
// Send a request to the url
|
||||
match request(path).await {
|
||||
match json(path).await {
|
||||
// If success, receive JSON in response
|
||||
Ok(res) => {
|
||||
// Metadata regarding the subreddit
|
||||
@ -163,14 +275,15 @@ async fn subreddit(sub: &str) -> Result<Subreddit, String> {
|
||||
let active: i64 = res["data"]["accounts_active"].as_u64().unwrap_or_default() as i64;
|
||||
|
||||
// Fetch subreddit icon either from the community_icon or icon_img value
|
||||
let community_icon: &str = res["data"]["community_icon"].as_str().map_or("", |s| s.split('?').collect::<Vec<&str>>()[0]);
|
||||
let community_icon: &str = res["data"]["community_icon"].as_str().unwrap();
|
||||
let icon = if community_icon.is_empty() { val(&res, "icon_img") } else { community_icon.to_string() };
|
||||
|
||||
let sub = Subreddit {
|
||||
name: val(&res, "display_name"),
|
||||
title: val(&res, "title"),
|
||||
description: val(&res, "public_description"),
|
||||
name: esc!(&res, "display_name"),
|
||||
title: esc!(&res, "title"),
|
||||
description: esc!(&res, "public_description"),
|
||||
info: rewrite_urls(&val(&res, "description_html").replace("\\", "")),
|
||||
moderators: moderators_list(sub).await?,
|
||||
icon: format_url(&icon),
|
||||
members: format_num(members),
|
||||
active: format_num(active),
|
||||
|
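As the subscription handler above shows, the feed is stored entirely client-side: the `subscriptions` cookie holds a `+`-separated list of display names, `Preferences::new` splits it apart, and the handler joins it back together (or removes the cookie when the list is empty). A small round-trip sketch under that assumption; the helper names are illustrative:

// Sketch of the subscriptions cookie round trip used above.
fn split_subscriptions(cookie_value: &str) -> Vec<String> {
  // "rust+linux" -> ["rust", "linux"]; empty segments are dropped.
  cookie_value.split('+').map(String::from).filter(|s| !s.is_empty()).collect()
}

fn join_subscriptions(sub_list: &[String]) -> String {
  // ["rust", "linux"] -> "rust+linux"; an empty list means the cookie is removed instead.
  sub_list.join("+")
}

fn main() {
  let subs = split_subscriptions("rust+linux");
  assert_eq!(join_subscriptions(&subs), "rust+linux");
}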
28 src/user.rs
@@ -1,7 +1,10 @@
|
||||
// CRATES
|
||||
use crate::utils::*;
|
||||
use crate::client::json;
|
||||
use crate::esc;
|
||||
use crate::server::RequestExt;
|
||||
use crate::utils::{error, format_url, param, template, Post, Preferences, User};
|
||||
use askama::Template;
|
||||
use tide::Request;
|
||||
use hyper::{Body, Request, Response};
|
||||
use time::OffsetDateTime;
|
||||
|
||||
// STRUCTS
|
||||
@ -13,19 +16,25 @@ struct UserTemplate {
|
||||
sort: (String, String),
|
||||
ends: (String, String),
|
||||
prefs: Preferences,
|
||||
url: String,
|
||||
}
|
||||
|
||||
// FUNCTIONS
|
||||
pub async fn profile(req: Request<()>) -> tide::Result {
|
||||
pub async fn profile(req: Request<Body>) -> Result<Response<Body>, String> {
|
||||
// Build the Reddit JSON API path
|
||||
let path = format!("{}.json?{}&raw_json=1", req.url().path(), req.url().query().unwrap_or_default());
|
||||
let path = format!(
|
||||
"/user/{}.json?{}&raw_json=1",
|
||||
req.param("name").unwrap_or_else(|| "reddit".to_string()),
|
||||
req.uri().query().unwrap_or_default()
|
||||
);
|
||||
|
||||
// Retrieve other variables from Libreddit request
|
||||
let sort = param(&path, "sort");
|
||||
let username = req.param("name").unwrap_or("").to_string();
|
||||
let username = req.param("name").unwrap_or_default();
|
||||
|
||||
// Request user posts/comments from Reddit
|
||||
let posts = Post::fetch(&path, "Comment".to_string()).await;
|
||||
let url = String::from(req.uri().path_and_query().map_or("", |val| val.as_str()));
|
||||
|
||||
match posts {
|
||||
Ok((posts, after)) => {
|
||||
@ -38,6 +47,7 @@ pub async fn profile(req: Request<()>) -> tide::Result {
|
||||
sort: (sort, param(&path, "t")),
|
||||
ends: (param(&path, "after"), after),
|
||||
prefs: Preferences::new(req),
|
||||
url,
|
||||
})
|
||||
}
|
||||
// If there is an error show error page
|
||||
@ -51,23 +61,23 @@ async fn user(name: &str) -> Result<User, String> {
|
||||
let path: String = format!("/user/{}/about.json?raw_json=1", name);
|
||||
|
||||
// Send a request to the url
|
||||
match request(path).await {
|
||||
match json(path).await {
|
||||
// If success, receive JSON in response
|
||||
Ok(res) => {
|
||||
// Grab creation date as unix timestamp
|
||||
let created: i64 = res["data"]["created"].as_f64().unwrap_or(0.0).round() as i64;
|
||||
|
||||
// nested_val function used to parse JSON from Reddit APIs
|
||||
// Closure used to parse JSON from Reddit APIs
|
||||
let about = |item| res["data"]["subreddit"][item].as_str().unwrap_or_default().to_string();
|
||||
|
||||
// Parse the JSON output into a User struct
|
||||
Ok(User {
|
||||
name: name.to_string(),
|
||||
title: about("title"),
|
||||
title: esc!(about("title")),
|
||||
icon: format_url(&about("icon_img")),
|
||||
karma: res["data"]["total_karma"].as_i64().unwrap_or(0),
|
||||
created: OffsetDateTime::from_unix_timestamp(created).format("%b %d '%y"),
|
||||
banner: about("banner_img"),
|
||||
banner: esc!(about("banner_img")),
|
||||
description: about("public_description"),
|
||||
})
|
||||
}
|
||||
|
290 src/utils.rs
@@ -1,13 +1,15 @@
|
||||
//
|
||||
// CRATES
|
||||
//
|
||||
use crate::{client::json, esc, server::RequestExt};
|
||||
use askama::Template;
|
||||
use cached::proc_macro::cached;
|
||||
use cookie::Cookie;
|
||||
use hyper::{Body, Request, Response};
|
||||
use regex::Regex;
|
||||
use serde_json::{from_str, Error, Value};
|
||||
use serde_json::Value;
|
||||
use std::collections::HashMap;
|
||||
use tide::{http::url::Url, http::Cookie, Request, Response};
|
||||
use time::{Duration, OffsetDateTime};
|
||||
use url::Url;
|
||||
|
||||
// Post flair with content, background color and foreground color
|
||||
pub struct Flair {
|
||||
@ -73,6 +75,7 @@ pub struct Flags {
|
||||
|
||||
pub struct Media {
|
||||
pub url: String,
|
||||
pub alt_url: String,
|
||||
pub width: i64,
|
||||
pub height: i64,
|
||||
pub poster: String,
|
||||
@ -83,12 +86,28 @@ impl Media {
|
||||
let mut gallery = Vec::new();
|
||||
|
||||
// If post is a video, return the video
|
||||
let (post_type, url_val) = if data["preview"]["reddit_video_preview"]["fallback_url"].is_string() {
|
||||
let (post_type, url_val, alt_url_val) = if data["preview"]["reddit_video_preview"]["fallback_url"].is_string() {
|
||||
// Return reddit video
|
||||
("video", &data["preview"]["reddit_video_preview"]["fallback_url"])
|
||||
(
|
||||
if data["preview"]["reddit_video_preview"]["is_gif"].as_bool().unwrap_or(false) {
|
||||
"gif"
|
||||
} else {
|
||||
"video"
|
||||
},
|
||||
&data["preview"]["reddit_video_preview"]["fallback_url"],
|
||||
Some(&data["preview"]["reddit_video_preview"]["hls_url"]),
|
||||
)
|
||||
} else if data["secure_media"]["reddit_video"]["fallback_url"].is_string() {
|
||||
// Return reddit video
|
||||
("video", &data["secure_media"]["reddit_video"]["fallback_url"])
|
||||
(
|
||||
if data["preview"]["reddit_video_preview"]["is_gif"].as_bool().unwrap_or(false) {
|
||||
"gif"
|
||||
} else {
|
||||
"video"
|
||||
},
|
||||
&data["secure_media"]["reddit_video"]["fallback_url"],
|
||||
Some(&data["secure_media"]["reddit_video"]["hls_url"]),
|
||||
)
|
||||
} else if data["post_hint"].as_str().unwrap_or("") == "image" {
|
||||
// Handle images, whether GIFs or pics
|
||||
let preview = &data["preview"]["images"][0];
|
||||
@ -96,26 +115,26 @@ impl Media {
|
||||
|
||||
if mp4.is_object() {
|
||||
// Return the mp4 if the media is a gif
|
||||
("gif", &mp4["source"]["url"])
|
||||
("gif", &mp4["source"]["url"], None)
|
||||
} else {
|
||||
// Return the picture if the media is an image
|
||||
if data["domain"] == "i.redd.it" {
|
||||
("image", &data["url"])
|
||||
("image", &data["url"], None)
|
||||
} else {
|
||||
("image", &preview["source"]["url"])
|
||||
("image", &preview["source"]["url"], None)
|
||||
}
|
||||
}
|
||||
} else if data["is_self"].as_bool().unwrap_or_default() {
|
||||
// If type is self, return permalink
|
||||
("self", &data["permalink"])
|
||||
("self", &data["permalink"], None)
|
||||
} else if data["is_gallery"].as_bool().unwrap_or_default() {
|
||||
// If this post contains a gallery of images
|
||||
gallery = GalleryMedia::parse(&data["gallery_data"]["items"], &data["media_metadata"]);
|
||||
|
||||
("gallery", &data["url"])
|
||||
("gallery", &data["url"], None)
|
||||
} else {
|
||||
// If type can't be determined, return url
|
||||
("link", &data["url"])
|
||||
("link", &data["url"], None)
|
||||
};
|
||||
|
||||
let source = &data["preview"]["images"][0]["source"];
|
||||
@ -123,13 +142,16 @@ impl Media {
|
||||
let url = if post_type == "self" || post_type == "link" {
|
||||
url_val.as_str().unwrap_or_default().to_string()
|
||||
} else {
|
||||
format_url(url_val.as_str().unwrap_or_default()).to_string()
|
||||
format_url(url_val.as_str().unwrap_or_default())
|
||||
};
|
||||
|
||||
let alt_url = alt_url_val.map_or(String::new(), |val| format_url(val.as_str().unwrap_or_default()));
|
||||
|
||||
(
|
||||
post_type.to_string(),
|
||||
Self {
|
||||
url,
|
||||
alt_url,
|
||||
width: source["width"].as_i64().unwrap_or_default(),
|
||||
height: source["height"].as_i64().unwrap_or_default(),
|
||||
poster: format_url(source["url"].as_str().unwrap_or_default()),
|
||||
@ -179,7 +201,7 @@ pub struct Post {
|
||||
pub body: String,
|
||||
pub author: Author,
|
||||
pub permalink: String,
|
||||
pub score: String,
|
||||
pub score: (String, String),
|
||||
pub upvote_ratio: i64,
|
||||
pub post_type: String,
|
||||
pub flair: Flair,
|
||||
@ -189,7 +211,7 @@ pub struct Post {
|
||||
pub domain: String,
|
||||
pub rel_time: String,
|
||||
pub created: String,
|
||||
pub comments: String,
|
||||
pub comments: (String, String),
|
||||
pub gallery: Vec<GalleryMedia>,
|
||||
}
|
||||
|
||||
@ -200,7 +222,7 @@ impl Post {
|
||||
let post_list;
|
||||
|
||||
// Send a request to the url
|
||||
match request(path.to_string()).await {
|
||||
match json(path.to_string()).await {
|
||||
// If success, receive JSON in response
|
||||
Ok(response) => {
|
||||
res = response;
|
||||
@ -224,14 +246,14 @@ impl Post {
|
||||
let (rel_time, created) = time(data["created_utc"].as_f64().unwrap_or_default());
|
||||
let score = data["score"].as_i64().unwrap_or_default();
|
||||
let ratio: f64 = data["upvote_ratio"].as_f64().unwrap_or(1.0) * 100.0;
|
||||
let title = val(post, "title");
|
||||
let title = esc!(post, "title");
|
||||
|
||||
// Determine the type of media along with the media URL
|
||||
let (post_type, media, gallery) = Media::parse(&data).await;
|
||||
|
||||
posts.push(Self {
|
||||
id: val(post, "id"),
|
||||
title: if title.is_empty() { fallback_title.to_owned() } else { title },
|
||||
title: esc!(if title.is_empty() { fallback_title.to_owned() } else { title }),
|
||||
community: val(post, "subreddit"),
|
||||
body: rewrite_urls(&val(post, "body_html")),
|
||||
author: Author {
|
||||
@ -242,14 +264,14 @@ impl Post {
|
||||
data["author_flair_richtext"].as_array(),
|
||||
data["author_flair_text"].as_str(),
|
||||
),
|
||||
text: val(post, "link_flair_text"),
|
||||
text: esc!(post, "link_flair_text"),
|
||||
background_color: val(post, "author_flair_background_color"),
|
||||
foreground_color: val(post, "author_flair_text_color"),
|
||||
},
|
||||
distinguished: val(post, "distinguished"),
|
||||
},
|
||||
score: if data["hide_score"].as_bool().unwrap_or_default() {
|
||||
"•".to_string()
|
||||
("\u{2022}".to_string(), "Hidden".to_string())
|
||||
} else {
|
||||
format_num(score)
|
||||
},
|
||||
@ -257,6 +279,7 @@ impl Post {
|
||||
post_type,
|
||||
thumbnail: Media {
|
||||
url: format_url(val(post, "thumbnail").as_str()),
|
||||
alt_url: String::new(),
|
||||
width: data["thumbnail_width"].as_i64().unwrap_or_default(),
|
||||
height: data["thumbnail_height"].as_i64().unwrap_or_default(),
|
||||
poster: "".to_string(),
|
||||
@ -269,7 +292,7 @@ impl Post {
|
||||
data["link_flair_richtext"].as_array(),
|
||||
data["link_flair_text"].as_str(),
|
||||
),
|
||||
text: val(post, "link_flair_text"),
|
||||
text: esc!(post, "link_flair_text"),
|
||||
background_color: val(post, "link_flair_background_color"),
|
||||
foreground_color: if val(post, "link_flair_text_color") == "dark" {
|
||||
"black".to_string()
|
||||
@ -305,7 +328,7 @@ pub struct Comment {
|
||||
pub post_author: String,
|
||||
pub body: String,
|
||||
pub author: Author,
|
||||
pub score: String,
|
||||
pub score: (String, String),
|
||||
pub rel_time: String,
|
||||
pub created: String,
|
||||
pub edited: (String, String),
|
||||
@ -339,9 +362,10 @@ pub struct Subreddit {
|
||||
pub title: String,
|
||||
pub description: String,
|
||||
pub info: String,
|
||||
pub moderators: Vec<String>,
|
||||
pub icon: String,
|
||||
pub members: String,
|
||||
pub active: String,
|
||||
pub members: (String, String),
|
||||
pub active: (String, String),
|
||||
pub wiki: bool,
|
||||
}
|
||||
|
||||
@ -362,21 +386,27 @@ pub struct Preferences {
|
||||
pub layout: String,
|
||||
pub wide: String,
|
||||
pub show_nsfw: String,
|
||||
pub hide_hls_notification: String,
|
||||
pub use_hls: String,
|
||||
pub comment_sort: String,
|
||||
pub post_sort: String,
|
||||
pub subscriptions: Vec<String>,
|
||||
}
|
||||
|
||||
impl Preferences {
|
||||
// Build preferences from cookies
|
||||
pub fn new(req: Request<()>) -> Self {
|
||||
pub fn new(req: Request<Body>) -> Self {
|
||||
Self {
|
||||
theme: cookie(&req, "theme"),
|
||||
front_page: cookie(&req, "front_page"),
|
||||
layout: cookie(&req, "layout"),
|
||||
wide: cookie(&req, "wide"),
|
||||
show_nsfw: cookie(&req, "show_nsfw"),
|
||||
comment_sort: cookie(&req, "comment_sort"),
|
||||
subscriptions: cookie(&req, "subscriptions").split('+').map(String::from).filter(|s| !s.is_empty()).collect(),
|
||||
theme: setting(&req, "theme"),
|
||||
front_page: setting(&req, "front_page"),
|
||||
layout: setting(&req, "layout"),
|
||||
wide: setting(&req, "wide"),
|
||||
show_nsfw: setting(&req, "show_nsfw"),
|
||||
use_hls: setting(&req, "use_hls"),
|
||||
hide_hls_notification: setting(&req, "hide_hls_notification"),
|
||||
comment_sort: setting(&req, "comment_sort"),
|
||||
post_sort: setting(&req, "post_sort"),
|
||||
subscriptions: setting(&req, "subscriptions").split('+').map(String::from).filter(|s| !s.is_empty()).collect(),
|
||||
}
|
||||
}
|
||||
}
|
||||
@ -393,10 +423,34 @@ pub fn param(path: &str, value: &str) -> String {
|
||||
}
|
||||
}
|
||||
|
||||
// Parse a cookie value from request
|
||||
pub fn cookie(req: &Request<()>, name: &str) -> String {
|
||||
let cookie = req.cookie(name).unwrap_or_else(|| Cookie::named(name));
|
||||
cookie.value().to_string()
|
||||
// Retrieve the value of a setting by name
|
||||
pub fn setting(req: &Request<Body>, name: &str) -> String {
|
||||
// Parse a cookie value from request
|
||||
req
|
||||
.cookie(name)
|
||||
.unwrap_or_else(|| {
|
||||
// If there is no cookie for this setting, try receiving a default from an environment variable
|
||||
if let Ok(default) = std::env::var(format!("LIBREDDIT_DEFAULT_{}", name.to_uppercase())) {
|
||||
Cookie::new(name, default)
|
||||
} else {
|
||||
Cookie::named(name)
|
||||
}
|
||||
})
|
||||
.value()
|
||||
.to_string()
|
||||
}
|
||||
|
||||
// Detect and redirect in the event of a random subreddit
|
||||
pub async fn catch_random(sub: &str, additional: &str) -> Result<Response<Body>, String> {
|
||||
if (sub == "random" || sub == "randnsfw") && !sub.contains('+') {
|
||||
let new_sub = json(format!("/r/{}/about.json?raw_json=1", sub)).await?["data"]["display_name"]
|
||||
.as_str()
|
||||
.unwrap_or_default()
|
||||
.to_string();
|
||||
Ok(redirect(format!("/r/{}{}", new_sub, additional)))
|
||||
} else {
|
||||
Err("No redirect needed".to_string())
|
||||
}
|
||||
}
|
||||
|
||||
// Direct urls to proxy if proxy is enabled
|
||||
@ -408,12 +462,12 @@ pub fn format_url(url: &str) -> String {
|
||||
Ok(parsed) => {
|
||||
let domain = parsed.domain().unwrap_or_default();
|
||||
|
||||
let capture = |regex: &str, format: &str, levels: i16| {
|
||||
let capture = |regex: &str, format: &str, segments: i16| {
|
||||
Regex::new(regex)
|
||||
.map(|re| match re.captures(url) {
|
||||
Some(caps) => match levels {
|
||||
1 => [format, &caps[1], "/"].join(""),
|
||||
2 => [format, &caps[1], "/", &caps[2], "/"].join(""),
|
||||
Some(caps) => match segments {
|
||||
1 => [format, &caps[1]].join(""),
|
||||
2 => [format, &caps[1], "/", &caps[2]].join(""),
|
||||
_ => String::new(),
|
||||
},
|
||||
None => String::new(),
|
||||
@ -421,14 +475,38 @@ pub fn format_url(url: &str) -> String {
|
||||
.unwrap_or_default()
|
||||
};
|
||||
|
||||
macro_rules! chain {
|
||||
() => {
|
||||
{
|
||||
String::new()
|
||||
}
|
||||
};
|
||||
|
||||
( $first_fn:expr, $($other_fns:expr), *) => {
|
||||
{
|
||||
let result = $first_fn;
|
||||
if result.is_empty() {
|
||||
chain!($($other_fns,)*)
|
||||
}
|
||||
else
|
||||
{
|
||||
result
|
||||
}
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
match domain {
|
||||
"v.redd.it" => capture(r"https://v\.redd\.it/(.*)/DASH_([0-9]{2,4}(\.mp4|$))", "/vid/", 2),
|
||||
"v.redd.it" => chain!(
|
||||
capture(r"https://v\.redd\.it/(.*)/DASH_([0-9]{2,4}(\.mp4|$))", "/vid/", 2),
|
||||
capture(r"https://v\.redd\.it/(.+)/(HLSPlaylist\.m3u8.*)$", "/hls/", 2)
|
||||
),
|
||||
"i.redd.it" => capture(r"https://i\.redd\.it/(.*)", "/img/", 1),
|
||||
"a.thumbs.redditmedia.com" => capture(r"https://a\.thumbs\.redditmedia\.com/(.*)", "/thumb/a/", 1),
|
||||
"b.thumbs.redditmedia.com" => capture(r"https://b\.thumbs\.redditmedia\.com/(.*)", "/thumb/b/", 1),
|
||||
"emoji.redditmedia.com" => capture(r"https://emoji\.redditmedia\.com/(.*)/(.*)", "/emoji/", 2),
|
||||
"preview.redd.it" => capture(r"https://preview\.redd\.it/(.*)\?(.*)", "/preview/pre/", 2),
|
||||
"external-preview.redd.it" => capture(r"https://external\-preview\.redd\.it/(.*)\?(.*)", "/preview/external-pre/", 2),
|
||||
"preview.redd.it" => capture(r"https://preview\.redd\.it/(.*)", "/preview/pre/", 1),
|
||||
"external-preview.redd.it" => capture(r"https://external\-preview\.redd\.it/(.*)", "/preview/external-pre/", 1),
|
||||
"styles.redditmedia.com" => capture(r"https://styles\.redditmedia\.com/(.*)", "/style/", 1),
|
||||
"www.redditstatic.com" => capture(r"https://www\.redditstatic\.com/(.*)", "/static/", 1),
|
||||
_ => String::new(),
|
||||
@ -440,22 +518,36 @@ pub fn format_url(url: &str) -> String {
|
||||
}
|
||||
|
||||
// Rewrite Reddit links to Libreddit in body of text
|
||||
pub fn rewrite_urls(text: &str) -> String {
|
||||
match Regex::new(r#"href="(https|http|)://(www.|old.|np.|)(reddit).(com)/"#) {
|
||||
Ok(re) => re.replace_all(text, r#"href="/"#).to_string(),
|
||||
pub fn rewrite_urls(input_text: &str) -> String {
|
||||
let text1 = match Regex::new(r#"href="(https|http|)://(www.|old.|np.|amp.|)(reddit).(com)/"#) {
|
||||
Ok(re) => re.replace_all(input_text, r#"href="/"#).to_string(),
|
||||
Err(_) => String::new(),
|
||||
};
|
||||
|
||||
// Rewrite external media previews to Libreddit
|
||||
match Regex::new(r"https://external-preview\.redd\.it(.*)[^?]") {
|
||||
Ok(re) => {
|
||||
if re.is_match(&text1) {
|
||||
re.replace_all(&text1, format_url(re.find(&text1).unwrap().as_str())).to_string()
|
||||
} else {
|
||||
text1
|
||||
}
|
||||
}
|
||||
Err(_) => String::new(),
|
||||
}
|
||||
}
|
||||
|
||||
// Append `m` and `k` for millions and thousands respectively
|
||||
pub fn format_num(num: i64) -> String {
|
||||
if num >= 1_000_000 {
|
||||
pub fn format_num(num: i64) -> (String, String) {
|
||||
let truncated = if num >= 1_000_000 || num <= -1_000_000 {
|
||||
format!("{}m", num / 1_000_000)
|
||||
} else if num >= 1000 {
|
||||
} else if num >= 1000 || num <= -1000 {
|
||||
format!("{}k", num / 1_000)
|
||||
} else {
|
||||
num.to_string()
|
||||
}
|
||||
};
|
||||
|
||||
(truncated, num.to_string())
|
||||
}
|
||||
|
||||
// Parse a relative and absolute time from a UNIX timestamp
|
||||
@ -483,23 +575,51 @@ pub fn val(j: &Value, k: &str) -> String {
|
||||
j["data"][k].as_str().unwrap_or_default().to_string()
|
||||
}
|
||||
|
||||
#[macro_export]
|
||||
macro_rules! esc {
|
||||
($f:expr) => {
|
||||
$f.replace('<', "<").replace('>', ">")
|
||||
};
|
||||
($j:expr, $k:expr) => {
|
||||
$j["data"][$k].as_str().unwrap_or_default().to_string().replace('<', "<").replace('>', ">")
|
||||
};
|
||||
}
|
||||
|
||||
// Escape < and > to accurately render HTML
|
||||
// pub fn esc(j: &Value, k: &str) -> String {
|
||||
// val(j,k)
|
||||
// // .replace('&', "&")
|
||||
// .replace('<', "<")
|
||||
// .replace('>', ">")
|
||||
// // .replace('"', """)
|
||||
// // .replace('\'', "'")
|
||||
// // .replace('/', "/")
|
||||
// }
|
||||
|
||||
//
|
||||
// NETWORKING
|
||||
//
|
||||
|
||||
pub fn template(t: impl Template) -> tide::Result {
|
||||
Ok(Response::builder(200).content_type("text/html").body(t.render().unwrap_or_default()).build())
|
||||
pub fn template(t: impl Template) -> Result<Response<Body>, String> {
|
||||
Ok(
|
||||
Response::builder()
|
||||
.status(200)
|
||||
.header("content-type", "text/html")
|
||||
.body(t.render().unwrap_or_default().into())
|
||||
.unwrap_or_default(),
|
||||
)
|
||||
}
|
||||
|
||||
pub fn redirect(path: String) -> Response {
|
||||
Response::builder(302)
|
||||
.content_type("text/html")
|
||||
pub fn redirect(path: String) -> Response<Body> {
|
||||
Response::builder()
|
||||
.status(302)
|
||||
.header("content-type", "text/html")
|
||||
.header("Location", &path)
|
||||
.body(format!("Redirecting to <a href=\"{0}\">{0}</a>...", path))
|
||||
.build()
|
||||
.body(format!("Redirecting to <a href=\"{0}\">{0}</a>...", path).into())
|
||||
.unwrap_or_default()
|
||||
}
|
||||
|
||||
pub async fn error(req: Request<()>, msg: String) -> tide::Result {
|
||||
pub async fn error(req: Request<Body>, msg: String) -> Result<Response<Body>, String> {
|
||||
let body = ErrorTemplate {
|
||||
msg,
|
||||
prefs: Preferences::new(req),
|
||||
@ -507,57 +627,5 @@ pub async fn error(req: Request<()>, msg: String) -> tide::Result {
|
||||
.render()
|
||||
.unwrap_or_default();
|
||||
|
||||
Ok(Response::builder(404).content_type("text/html").body(body).build())
|
||||
}
|
||||
|
||||
// Make a request to a Reddit API and parse the JSON response
|
||||
#[cached(size = 100, time = 30, result = true)]
|
||||
pub async fn request(path: String) -> Result<Value, String> {
|
||||
let url = format!("https://www.reddit.com{}", path);
|
||||
// Build reddit-compliant user agent for Libreddit
|
||||
let user_agent = format!("web:libreddit:{}", env!("CARGO_PKG_VERSION"));
|
||||
|
||||
// Send request using surf
|
||||
let req = surf::get(&url).header("User-Agent", user_agent.as_str());
|
||||
let client = surf::client().with(surf::middleware::Redirect::new(5));
|
||||
|
||||
let res = client.send(req).await;
|
||||
|
||||
let err = |msg: &str, e: String| -> Result<Value, String> {
|
||||
eprintln!("{} - {}: {}", url, msg, e);
|
||||
Err(msg.to_string())
|
||||
};
|
||||
|
||||
match res {
|
||||
Ok(mut response) => match response.take_body().into_string().await {
|
||||
// If response is success
|
||||
Ok(body) => {
|
||||
// Parse the response from Reddit as JSON
|
||||
let parsed: Result<Value, Error> = from_str(&body);
|
||||
match parsed {
|
||||
Ok(json) => {
|
||||
// If Reddit returned an error
|
||||
if json["error"].is_i64() {
|
||||
Err(
|
||||
json["reason"]
|
||||
.as_str()
|
||||
.unwrap_or_else(|| {
|
||||
json["message"].as_str().unwrap_or_else(|| {
|
||||
eprintln!("{} - Error parsing reddit error", url);
|
||||
"Error parsing reddit error"
|
||||
})
|
||||
})
|
||||
.to_string(),
|
||||
)
|
||||
} else {
|
||||
Ok(json)
|
||||
}
|
||||
}
|
||||
Err(e) => err("Failed to parse page JSON data", e.to_string()),
|
||||
}
|
||||
}
|
||||
Err(e) => err("Couldn't parse request body", e.to_string()),
|
||||
},
|
||||
Err(e) => err("Couldn't send request to Reddit", e.to_string()),
|
||||
}
|
||||
Ok(Response::builder().status(404).header("content-type", "text/html").body(body.into()).unwrap_or_default())
|
||||
}
|
||||
|
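As a usage note for the widened `format_num` above: it now returns a `(truncated, full)` pair so templates can show the short label while keeping the exact count in a `title` tooltip (see the `score.0` / `score.1` usage in the template changes further down). A tiny hedged sketch of the expected behaviour, assuming `format_num` from utils.rs is in scope:

// Sketch: format_num now yields (rounded label, exact value).
fn demo() {
  assert_eq!(format_num(1_400_000), ("1m".to_string(), "1400000".to_string()));
  assert_eq!(format_num(5_300), ("5k".to_string(), "5300".to_string()));
  assert_eq!(format_num(250), ("250".to_string(), "250".to_string()));
}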
Binary image file not shown (before: 5.5 KiB, after: 8.0 KiB).
5 static/hls.min.js (vendored, new file)
File diff suppressed because one or more lines are too long
77 static/playHLSVideo.js (new file)
@@ -0,0 +1,77 @@
|
||||
// @license http://www.gnu.org/licenses/agpl-3.0.html AGPL-3.0
|
||||
(function () {
|
||||
if (Hls.isSupported()) {
|
||||
var videoSources = document.querySelectorAll("video source[type='application/vnd.apple.mpegurl']");
|
||||
videoSources.forEach(function (source) {
|
||||
var playlist = source.src;
|
||||
|
||||
var oldVideo = source.parentNode;
|
||||
var autoplay = oldVideo.classList.contains("hls_autoplay");
|
||||
|
||||
// If HLS is supported natively then don't use hls.js
|
||||
if (oldVideo.canPlayType(source.type)) {
|
||||
if (autoplay) {
|
||||
oldVideo.play();
|
||||
}
|
||||
return;
|
||||
}
|
||||
|
||||
// Replace video with copy that will have all "source" elements removed
|
||||
var newVideo = oldVideo.cloneNode(true);
|
||||
var allSources = newVideo.querySelectorAll("source");
|
||||
allSources.forEach(function (source) {
|
||||
source.remove();
|
||||
});
|
||||
|
||||
// Empty source to enable play event
|
||||
newVideo.src = "about:blank";
|
||||
|
||||
oldVideo.parentNode.replaceChild(newVideo, oldVideo);
|
||||
|
||||
function initializeHls() {
|
||||
newVideo.removeEventListener('play', initializeHls);
|
||||
|
||||
var hls = new Hls({ autoStartLoad: false });
|
||||
hls.loadSource(playlist);
|
||||
hls.attachMedia(newVideo);
|
||||
hls.on(Hls.Events.MANIFEST_PARSED, function () {
|
||||
hls.loadLevel = hls.levels.length - 1;
|
||||
hls.startLoad();
|
||||
newVideo.play();
|
||||
});
|
||||
|
||||
hls.on(Hls.Events.ERROR, function (event, data) {
|
||||
var errorType = data.type;
|
||||
var errorFatal = data.fatal;
|
||||
if (errorFatal) {
|
||||
switch (errorType) {
|
||||
case Hls.ErrorType.NETWORK_ERROR:
|
||||
hls.startLoad();
|
||||
break;
|
||||
case Hls.ErrorType.MEDIA_ERROR:
|
||||
hls.recoverMediaError();
|
||||
break;
|
||||
default:
|
||||
hls.destroy();
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
console.error("HLS error", data);
|
||||
});
|
||||
}
|
||||
|
||||
newVideo.addEventListener('play', initializeHls);
|
||||
|
||||
if (autoplay) {
|
||||
newVideo.play();
|
||||
}
|
||||
});
|
||||
} else {
|
||||
var videos = document.querySelectorAll("video.hls_autoplay");
|
||||
videos.forEach(function (video) {
|
||||
video.setAttribute("autoplay", "");
|
||||
});
|
||||
}
|
||||
})();
|
||||
// @license-end
|
210
static/style.css
210
static/style.css
@ -65,6 +65,75 @@
|
||||
--shadow: 0 1px 3px rgba(0, 0, 0, 0.1);
|
||||
}
|
||||
|
||||
/* Dracula theme setting */
|
||||
.dracula {
|
||||
--accent: #bd93f9;
|
||||
--green: #50fa7b;
|
||||
--text: #f8f8f2;
|
||||
--foreground: #3d4051;
|
||||
--background: #282a36;
|
||||
--outside: #44475a;
|
||||
--post: #44475a;
|
||||
--panel-border: 2px solid #44475a;
|
||||
--highlighted: #4e5267;
|
||||
--shadow: 0 1px 3px rgba(0, 0, 0, 0.1);
|
||||
}
|
||||
|
||||
/* Nord theme setting */
|
||||
.nord {
|
||||
--accent: #8fbcbb;
|
||||
--green: #a3be8c;
|
||||
--text: #eceff4;
|
||||
--foreground: #3b4252;
|
||||
--background: #2e3440;
|
||||
--outside: #434c5e;
|
||||
--post: #434c5e;
|
||||
--panel-border: 2px solid #4c566a;
|
||||
--highlighted: #3b4252;
|
||||
--shadow: 0 1px 3px rgba(0, 0, 0, 0.1);
|
||||
}
|
||||
|
||||
/* Laserwave theme setting */
|
||||
.laserwave {
|
||||
--accent: #eb64b9;
|
||||
--green: #74dfc4;
|
||||
--text: #e0dfe1;
|
||||
--foreground: #302a36;
|
||||
--background: #27212e;
|
||||
--outside: #3e3647;
|
||||
--post: #3e3647;
|
||||
--panel-border: 2px solid #2f2738;
|
||||
--highlighted: #302a36;
|
||||
--shadow: 0 1px 3px rgba(0, 0, 0, 0.1);
|
||||
}
|
||||
|
||||
/* Violet theme setting */
|
||||
.violet {
|
||||
--accent: #7c71dd;
|
||||
--green: #5cff85;
|
||||
--text: white;
|
||||
--foreground: #1F2347;
|
||||
--background: #12152b;
|
||||
--outside: #181c3a;
|
||||
--post: #181c3a;
|
||||
--panel-border: 1px solid #1F2347;
|
||||
--highlighted: #1F2347;
|
||||
--shadow: 0 2px 5px rgba(0, 0, 0, 0.5);
|
||||
}
|
||||
|
||||
/* Gold theme setting */
|
||||
.gold {
|
||||
--accent: #f2aa4c;
|
||||
--green: #5cff85;
|
||||
--text: white;
|
||||
--foreground: #234;
|
||||
--background: #101820;
|
||||
--outside: #1b2936;
|
||||
--post: #1b2936;
|
||||
--panel-border: 0px solid black;
|
||||
--highlighted: #234;
|
||||
--shadow: 0 2px 5px rgba(0, 0, 0, 0.5);
|
||||
}
|
||||
|
||||
/* General */
|
||||
|
||||
@ -281,7 +350,7 @@ aside {
|
||||
|
||||
/* Subscriptions */
|
||||
|
||||
#sub_subscription {
|
||||
#sub_subscription, #user_subscription {
|
||||
margin-top: 20px;
|
||||
}
|
||||
|
||||
@ -522,7 +591,26 @@ button.submit:hover > svg { stroke: var(--accent); }
|
||||
|
||||
.search_subreddit {
|
||||
padding: 16px 20px;
|
||||
display: block;
|
||||
display: flex;
|
||||
}
|
||||
|
||||
.search_subreddit_left {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
}
|
||||
|
||||
.search_subreddit_left:not(:empty) {
|
||||
margin-right: 10px;
|
||||
}
|
||||
|
||||
.search_subreddit_left img {
|
||||
width: 35px;
|
||||
height: 35px;
|
||||
border-radius: 100%;
|
||||
}
|
||||
|
||||
.search_subreddit_right {
|
||||
overflow: auto;
|
||||
}
|
||||
|
||||
a.search_subreddit:hover {
|
||||
@ -576,6 +664,7 @@ a.search_subreddit:hover {
|
||||
"post_score post_title post_thumbnail" 1fr
|
||||
"post_score post_media post_thumbnail" auto
|
||||
"post_score post_body post_thumbnail" auto
|
||||
"post_score post_notification post_thumbnail" auto
|
||||
"post_score post_footer post_thumbnail" auto
|
||||
/ minmax(40px, auto) minmax(0, 1fr) fit-content(min(20%, 152px));
|
||||
}
|
||||
@ -618,6 +707,17 @@ a.search_subreddit:hover {
|
||||
grid-area: post_title;
|
||||
}
|
||||
|
||||
.post_notification {
|
||||
grid-area: post_notification;
|
||||
margin: 5px 15px;
|
||||
text-align: center;
|
||||
font-size: 12px;
|
||||
}
|
||||
|
||||
.post_notification a {
|
||||
text-decoration: underline;
|
||||
}
|
||||
|
||||
.post_flair {
|
||||
background: var(--accent);
|
||||
color: var(--background);
|
||||
@ -628,13 +728,13 @@ a.search_subreddit:hover {
|
||||
font-weight: bold;
|
||||
}
|
||||
|
||||
.author_flair, .post_flair:empty {
|
||||
.author_flair:empty, .post_flair:empty {
|
||||
display: none;
|
||||
}
|
||||
|
||||
.emoji {
|
||||
width: 1em;
|
||||
height: 1em;
|
||||
width: 1.25em;
|
||||
height: 1.25em;
|
||||
display: inline-block;
|
||||
background-size: contain;
|
||||
background-position: 50% 50%;
|
||||
@ -1084,6 +1184,8 @@ input[type="submit"] {
|
||||
|
||||
.md table {
|
||||
margin: 5px;
|
||||
display: block;
|
||||
overflow-x: auto;
|
||||
}
|
||||
|
||||
.md code {
|
||||
@ -1115,46 +1217,6 @@ td, th {
|
||||
|
||||
/* Mobile */
|
||||
|
||||
@media screen and (max-width: 480px) {
|
||||
#version { display: none; }
|
||||
|
||||
.post {
|
||||
grid-template: "post_header post_header post_thumbnail" auto
|
||||
"post_title post_title post_thumbnail" 1fr
|
||||
"post_media post_media post_thumbnail" auto
|
||||
"post_body post_body post_thumbnail" auto
|
||||
"post_score post_footer post_thumbnail" auto
|
||||
/ auto 1fr fit-content(min(20%, 152px));
|
||||
}
|
||||
|
||||
.post_score {
|
||||
margin: 5px 0px 20px 15px;
|
||||
padding: 0;
|
||||
}
|
||||
|
||||
.compact .post_score { padding: 0; }
|
||||
|
||||
.post_score::before { content: "↑" }
|
||||
|
||||
.post_header { font-size: 14px; }
|
||||
.post_footer { margin-left: 15px; }
|
||||
|
||||
.replies > .comment {
|
||||
margin-left: -25px;
|
||||
padding: 5px 0;
|
||||
}
|
||||
|
||||
.comment_left {
|
||||
min-width: 45px;
|
||||
padding: 5px 0px;
|
||||
}
|
||||
|
||||
.comment_author { margin-left: 10px; }
|
||||
.comment_score { min-width: 35px; }
|
||||
.comment_data::marker { font-size: 18px; }
|
||||
.created { width: 100%; }
|
||||
}
|
||||
|
||||
@media screen and (max-width: 800px) {
|
||||
body { padding-top: 120px }
|
||||
|
||||
@ -1196,3 +1258,61 @@ td, th {
|
||||
#logo, #links { margin-bottom: 5px; }
|
||||
#searchbox { width: calc(100vw - 35px); }
|
||||
}
|
||||
|
||||
@media screen and (max-width: 480px) {
|
||||
body { padding-top: 100px; }
|
||||
#version { display: none; }
|
||||
|
||||
.post {
|
||||
grid-template: "post_header post_header post_thumbnail" auto
|
||||
"post_title post_title post_thumbnail" 1fr
|
||||
"post_media post_media post_thumbnail" auto
|
||||
"post_body post_body post_thumbnail" auto
|
||||
"post_notification post_notification post_thumbnail" auto
|
||||
"post_score post_footer post_thumbnail" auto
|
||||
/ auto 1fr fit-content(min(20%, 152px));
|
||||
}
|
||||
|
||||
.post_score {
|
||||
margin: 5px 0px 20px 15px;
|
||||
padding: 0;
|
||||
}
|
||||
|
||||
.compact .post_score { padding: 0; }
|
||||
|
||||
.post_score::before { content: "↑" }
|
||||
|
||||
.post_header { font-size: 14px; }
|
||||
.post_footer { margin-left: 15px; }
|
||||
|
||||
.replies > .comment {
|
||||
margin-left: -12px;
|
||||
padding: 5px 0;
|
||||
}
|
||||
|
||||
.comment_left {
|
||||
min-width: auto;
|
||||
padding: 5px 0px;
|
||||
align-items: initial;
|
||||
margin-top: -5px;
|
||||
}
|
||||
|
||||
.line {
|
||||
margin-left: 5px;
|
||||
}
|
||||
|
||||
/* .thread { margin-left: -5px; } */
|
||||
.comment_right { padding: 5px 0 10px 2px; }
|
||||
.comment_author { margin-left: 12px; }
|
||||
.comment_data { margin-left: 12px; }
|
||||
.comment_data::marker { font-size: 25px; }
|
||||
.created { width: 100%; }
|
||||
|
||||
.comment_score {
|
||||
min-width: 32px;
|
||||
height: 20px;
|
||||
font-size: 15px;
|
||||
padding: 7px 0px;
|
||||
margin-right: -5px;
|
||||
}
|
||||
}
|
||||
|
@ -15,11 +15,11 @@
|
||||
<!-- Android -->
|
||||
<meta name="mobile-web-app-capable" content="yes">
|
||||
<!-- iOS Logo -->
|
||||
<link href="/touch-icon-iphone.png/" rel="apple-touch-icon">
|
||||
<link href="/touch-icon-iphone.png" rel="apple-touch-icon">
|
||||
<!-- PWA Manifest -->
|
||||
<link rel="manifest" type="application/json" href="/manifest.json/">
|
||||
<link rel="shortcut icon" type="image/x-icon" href="/favicon.ico/">
|
||||
<link rel="stylesheet" type="text/css" href="/style.css/">
|
||||
<link rel="manifest" type="application/json" href="/manifest.json">
|
||||
<link rel="shortcut icon" type="image/x-icon" href="/favicon.ico">
|
||||
<link rel="stylesheet" type="text/css" href="/style.css">
|
||||
{% endblock %}
|
||||
</head>
|
||||
<body class="
|
||||
@ -35,7 +35,7 @@
|
||||
</div>
|
||||
{% block search %}{% endblock %}
|
||||
<div id="links">
|
||||
<a id="settings_link" href="/settings/">
|
||||
<a id="settings_link" href="/settings">
|
||||
<span>settings</span>
|
||||
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
|
||||
<title>settings</title>
|
||||
|
@ -5,12 +5,12 @@
|
||||
{% else if kind == "t1" %}
|
||||
<div id="{{ id }}" class="comment">
|
||||
<div class="comment_left">
|
||||
<p class="comment_score">{{ score }}</p>
|
||||
<p class="comment_score" title="{{ score.1 }}">{{ score.0 }}</p>
|
||||
<div class="line"></div>
|
||||
</div>
|
||||
<details class="comment_right" open>
|
||||
<summary class="comment_data">
|
||||
<a class="comment_author {{ author.distinguished }} {% if author.name == post_author %}op{% endif %}" href="/u/{{ author.name }}">u/{{ author.name }}</a>
|
||||
<a class="comment_author {{ author.distinguished }} {% if author.name == post_author %}op{% endif %}" href="/user/{{ author.name }}">u/{{ author.name }}</a>
|
||||
{% if author.flair.flair_parts.len() > 0 %}
|
||||
<small class="author_flair">{% call utils::render_flair(author.flair.flair_parts) %}</small>
|
||||
{% endif %}
|
||||
|
@@ -13,7 +13,6 @@
<!-- Meta Tags -->
<meta name="author" content="u/{{ post.author.name }}">
<meta name="title" content="{{ post.title }} - r/{{ post.community }}">
<meta name="description" content="View on Libreddit, an alternative private front-end to Reddit.">
<meta property="og:type" content="website">
<meta property="og:url" content="{{ post.permalink }}">
<meta property="og:title" content="{{ post.title }} - r/{{ post.community }}">
@@ -38,7 +37,7 @@
<p class="post_header">
<a class="post_subreddit" href="/r/{{ post.community }}">r/{{ post.community }}</a>
<span class="dot">•</span>
<a class="post_author" href="/u/{{ post.author.name }}">u/{{ post.author.name }}</a>
<a class="post_author" href="/user/{{ post.author.name }}">u/{{ post.author.name }}</a>
{% if post.author.flair.flair_parts.len() > 0 %}
<small class="author_flair">{% call utils::render_flair(post.author.flair.flair_parts) %}</small>
{% endif %}
@@ -48,7 +47,7 @@
<p class="post_title">
<a href="{{ post.permalink }}">{{ post.title }}</a>
{% if post.flair.flair_parts.len() > 0 %}
<a href="/r/{{ post.community }}/search/?q=flair_name%3A%22{{ post.flair.text }}%22&amp;restrict_sr=on"
<a href="/r/{{ post.community }}/search?q=flair_name%3A%22{{ post.flair.text }}%22&amp;restrict_sr=on"
class="post_flair"
style="color:{{ post.flair.foreground_color }}; background:{{ post.flair.background_color }};">{% call utils::render_flair(post.flair.flair_parts) %}</a>
{% endif %}
@@ -69,7 +68,17 @@
</svg>
</a>
{% else if post.post_type == "video" || post.post_type == "gif" %}
{% if prefs.use_hls == "on" && !post.media.alt_url.is_empty() %}
<script src="/hls.min.js"></script>
<video class="post_media_video short hls_autoplay" width="{{ post.media.width }}" height="{{ post.media.height }}" poster="{{ post.media.poster }}" controls preload="none">
<source src="{{ post.media.alt_url }}" type="application/vnd.apple.mpegurl" />
<source src="{{ post.media.url }}" type="video/mp4" />
</video>
<script src="/playHLSVideo.js"></script>
{% else %}
<video class="post_media_video" src="{{ post.media.url }}" controls autoplay loop><a href={{ post.media.url }}>Video</a></video>
{% call utils::render_hls_notification(post.permalink[1..]) %}
{% endif %}
{% else if post.post_type == "gallery" %}
<div class="gallery">
{% for image in post.gallery -%}
@@ -78,23 +87,23 @@
<figcaption>
<p>{{ image.caption }}</p>
{% if image.outbound_url.len() > 0 %}
<p><a class="outbound_url" href="{{ image.outbound_url }}">{{ image.outbound_url }}</a>
<p><a class="outbound_url" href="{{ image.outbound_url }}" rel="nofollow">{{ image.outbound_url }}</a>
{% endif %}
</figcaption>
</figure>
{%- endfor %}
</div>
{% else if post.post_type == "link" %}
<a id="post_url" href="{{ post.media.url }}">{{ post.media.url }}</a>
<a id="post_url" href="{{ post.media.url }}" rel="nofollow">{{ post.media.url }}</a>
{% endif %}

<!-- POST BODY -->
<div class="post_body">{{ post.body }}</div>
<div class="post_score">{{ post.score }}<span class="label"> Upvotes</span></div>
<div class="post_score" title="{{ post.score.1 }}">{{ post.score.0 }}<span class="label"> Upvotes</span></div>
<div class="post_footer">
<ul id="post_links">
<li><a href="/{{ post.id }}">permalink</a></li>
<li><a href="https://reddit.com/{{ post.id }}">reddit</a></li>
<li><a href="https://reddit.com/{{ post.id }}" rel="nofollow">reddit</a></li>
</ul>
<p>{{ post.upvote_ratio }}% Upvoted</p>
</div>
@@ -34,12 +34,15 @@
<div id="search_subreddits">
{% for subreddit in subreddits %}
<a href="{{ subreddit.url }}" class="search_subreddit">
<div class="search_subreddit_left">{% if subreddit.icon != "" %}<img src="{{ subreddit.icon }}" alt="r/{{ subreddit.name }} icon">{% endif %}</div>
<div class="search_subreddit_right">
<p class="search_subreddit_header">
<span class="search_subreddit_name">{{ subreddit.name }}</span>
<span class="dot">•</span>
<span class="search_subreddit_members">{{ subreddit.subscribers }} Members</span>
<span class="search_subreddit_members" title="{{ subreddit.subscribers.1 }} Members">{{ subreddit.subscribers.0 }} Members</span>
</p>
<p class="search_subreddit_description">{{ subreddit.description }}</p>
</div>
</a>
{% endfor %}
</div>
@@ -52,7 +55,7 @@
{% else %}
<div class="comment">
<div class="comment_left">
<p class="comment_score">{{ post.score }}</p>
<p class="comment_score" title="{{ post.score.1 }}">{{ post.score.0 }}</p>
<div class="line"></div>
</div>
<details class="comment_right" open>
@@ -9,13 +9,13 @@

{% block content %}
<div id="settings">
<form action="/settings/" method="POST">
<form action="/settings" method="POST">
<div class="prefs">
<p>Appearance</p>
<div id="theme">
<label for="theme">Theme:</label>
<select name="theme">
{% call utils::options(prefs.theme, ["system", "light", "dark", "black"], "system") %}
{% call utils::options(prefs.theme, ["system", "light", "dark", "black", "dracula", "nord", "laserwave", "violet", "gold"], "system") %}
</select>
</div>
<p>Interface</p>
@@ -33,9 +33,16 @@
</div>
<div id="wide">
<label for="wide">Wide UI:</label>
<input type="hidden" value="off" name="wide">
<input type="checkbox" name="wide" {% if prefs.wide == "on" %}checked{% endif %}>
</div>
<p>Content</p>
<div id="post_sort">
<label for="post_sort" title="Applies only to subreddit feeds">Default subreddit post sort:</label>
<select name="post_sort">
{% call utils::options(prefs.post_sort, ["hot", "new", "top", "rising", "controversial"], "hot") %}
</select>
</div>
<div id="comment_sort">
<label for="comment_sort">Default comment sort:</label>
<select name="comment_sort">
@@ -44,17 +51,28 @@
</div>
<div id="show_nsfw">
<label for="show_nsfw">Show NSFW posts:</label>
<input type="hidden" value="off" name="show_nsfw">
<input type="checkbox" name="show_nsfw" {% if prefs.show_nsfw == "on" %}checked{% endif %}>
</div>
<div id="use_hls">
<label for="use_hls">Use HLS for videos</label>
<input type="hidden" value="off" name="use_hls">
<input type="checkbox" name="use_hls" {% if prefs.use_hls == "on" %}checked{% endif %}>
</div>
<div id="hide_hls_notification">
<label for="hide_hls_notification">Hide notification about possible HLS usage</label>
<input type="hidden" value="off" name="hide_hls_notification">
<input type="checkbox" name="hide_hls_notification" {% if prefs.hide_hls_notification == "on" %}checked{% endif %}>
</div>
<input id="save" type="submit" value="Save">
</div>
</form>
{% if prefs.subscriptions.len() > 0 %}
<div class="prefs" id="settings_subs">
<p>Subscribed Subreddits</p>
<p>Subscribed Feeds</p>
{% for sub in prefs.subscriptions %}
<div>
<span>{{ sub }}</span>
<span>{% if sub.starts_with("u_") -%}{{ format!("u/{}", &sub[2..]) }}{% else -%}{{ format!("r/{}", sub) }}{% endif -%}</span>
<form action="/r/{{ sub }}/unsubscribe/?redirect=settings" method="POST">
<button class="unsubscribe">Unsubscribe</button>
</form>
@@ -65,7 +83,7 @@

<div id="settings_note">
<p><b>Note:</b> settings and subscriptions are saved in browser cookies. Clearing your cookies will reset them.</p><br>
<p>You can restore your current settings and subscriptions after clearing your cookies using <a href="/settings/restore/?theme={{ prefs.theme }}&amp;front_page={{ prefs.front_page }}&amp;layout={{ prefs.layout }}&amp;wide={{ prefs.wide }}&amp;comment_sort={{ prefs.comment_sort }}&amp;show_nsfw={{ prefs.show_nsfw }}&amp;subscriptions={{ prefs.subscriptions.join("%2B") }}">this link</a>.</p>
<p>You can restore your current settings and subscriptions after clearing your cookies using <a href="/settings/restore/?theme={{ prefs.theme }}&amp;front_page={{ prefs.front_page }}&amp;layout={{ prefs.layout }}&amp;wide={{ prefs.wide }}&amp;comment_sort={{ prefs.comment_sort }}&amp;show_nsfw={{ prefs.show_nsfw }}&amp;use_hls={{ prefs.use_hls }}&amp;hide_hls_notification={{ prefs.hide_hls_notification }}&amp;subscriptions={{ prefs.subscriptions.join("%2B") }}">this link</a>.</p>
</div>
</div>
@@ -40,7 +40,7 @@
</form>

{% if sub.name.contains("+") %}
<form action="/r/{{ sub.name }}/subscribe/" method="POST">
<form action="/r/{{ sub.name }}/subscribe" method="POST">
<button id="multisub" class="subscribe" title="Subscribe to each sub in this multireddit">Subscribe to Multireddit</button>
</form>
{% endif %}
@@ -52,6 +52,10 @@
{% call utils::post_in_list(post) %}
{% endif %}
{% endfor %}
{% if prefs.use_hls == "on" %}
<script src="/hls.min.js"></script>
<script src="/playHLSVideo.js"></script>
{% endif %}
</div>

<footer>
@@ -81,16 +85,16 @@
<div id="sub_details">
<label>Members</label>
<label>Active</label>
<div>{{ sub.members }}</div>
<div>{{ sub.active }}</div>
<div title="{{ sub.members.1 }}">{{ sub.members.0 }}</div>
<div title="{{ sub.active.1 }}">{{ sub.active.0 }}</div>
</div>
<div id="sub_subscription">
{% if prefs.subscriptions.contains(sub.name) %}
<form action="/r/{{ sub.name }}/unsubscribe/" method="POST">
<form action="/r/{{ sub.name }}/unsubscribe" method="POST">
<button class="unsubscribe">Unsubscribe</button>
</form>
{% else %}
<form action="/r/{{ sub.name }}/subscribe/" method="POST">
<form action="/r/{{ sub.name }}/subscribe" method="POST">
<button class="subscribe">Subscribe</button>
</form>
{% endif %}
@@ -99,7 +103,17 @@
</div>
<details class="panel" id="sidebar">
<summary id="sidebar_label">Sidebar</summary>
<div id="sidebar_contents">{{ sub.info }}</div>
<div id="sidebar_contents">
{{ sub.info }}
<hr>
<h2>Moderators</h2>
<br>
<ul>
{% for moderator in sub.moderators %}
<li><a style="color: var(--accent)" href="/u/{{ moderator }}">{{ moderator }}</a></li>
{% endfor %}
</ul>
</div>
</details>
</aside>
{% endif %}
@@ -37,7 +37,7 @@
{% else %}
<div class="comment">
<div class="comment_left">
<p class="comment_score">{{ post.score }}</p>
<p class="comment_score" title="{{ post.score.1 }}">{{ post.score.0 }}</p>
<div class="line"></div>
</div>
<details class="comment_right" open>
@@ -65,7 +65,7 @@
</div>
<aside>
<div class="panel" id="user">
<img id="user_icon" src="{{ user.icon }}">
<img id="user_icon" src="{{ user.icon }}" alt="User icon">
<p id="user_title">{{ user.title }}</p>
<p id="user_name">u/{{ user.name }}</p>
<div id="user_description">{{ user.description }}</div>
@@ -75,6 +75,18 @@
<div>{{ user.karma }}</div>
<div>{{ user.created }}</div>
</div>
<div id="user_subscription">
{% let name = ["u_", user.name.as_str()].join("") %}
{% if prefs.subscriptions.contains(name) %}
<form action="/r/u_{{ user.name }}/unsubscribe" method="POST">
<button class="unsubscribe">Unfollow</button>
</form>
{% else %}
<form action="/r/u_{{ user.name }}/subscribe" method="POST">
<button class="subscribe">Follow</button>
</form>
{% endif %}
</div>
</div>
</aside>
</main>
@@ -15,7 +15,7 @@
{%- endmacro %}

{% macro search(root, search) -%}
<form action="{% if root != "/r/" && !root.is_empty() %}{{ root }}{% endif %}/search/" id="searchbox">
<form action="{% if root != "/r/" && !root.is_empty() %}{{ root }}{% endif %}/search" id="searchbox">
<input id="search" type="text" name="q" placeholder="Search" title="Search libreddit" value="{{ search }}">
{% if root != "/r/" && !root.is_empty() %}
<div id="inside">
@@ -55,10 +55,22 @@
{% endif %}
{%- endmacro %}

{% macro render_hls_notification(redirect_url) -%}
{% if post.post_type == "video" && !post.media.alt_url.is_empty() && prefs.hide_hls_notification != "on" %}
<div class="post_notification"><p><a href="/settings/update/?use_hls=on&amp;redirect={{ redirect_url }}">Enable HLS</a> to view with audio, or <a href="/settings/update/?hide_hls_notification=on&amp;redirect={{ redirect_url }}">disable this notification</a></p></div>
{% endif %}
{%- endmacro %}

{% macro post_in_list(post) -%}
<div class="post {% if post.flags.stickied %}stickied{% endif %}">
<div class="post {% if post.flags.stickied %}stickied{% endif %}" id="{{ post.id }}">
<p class="post_header">
<a class="post_subreddit" href="/r/{{ post.community }}">r/{{ post.community }}</a>
{% let community -%}
{% if post.community.starts_with("u_") -%}
{% let community = format!("u/{}", &post.community[2..]) -%}
{% else -%}
{% let community = format!("r/{}", post.community) -%}
{% endif -%}
<a class="post_subreddit" href="/{{ community }}">{{ community }}</a>
<span class="dot">•</span>
<a class="post_author" href="/u/{{ post.author.name }}">u/{{ post.author.name }}</a>
<span class="dot">•</span>
@@ -66,9 +78,10 @@
</p>
<p class="post_title">
{% if post.flair.flair_parts.len() > 0 %}
<a href="/r/{{ post.community }}/search/?q=flair_name%3A%22{{ post.flair.text }}%22&amp;restrict_sr=on"
<a href="/r/{{ post.community }}/search?q=flair_name%3A%22{{ post.flair.text }}%22&amp;restrict_sr=on"
class="post_flair"
style="color:{{ post.flair.foreground_color }}; background:{{ post.flair.background_color }};">{% call render_flair(post.flair.flair_parts) %}</a>
style="color:{{ post.flair.foreground_color }}; background:{{ post.flair.background_color }};"
dir="ltr">{% call render_flair(post.flair.flair_parts) %}</a>
{% endif %}
<a href="{{ post.permalink }}">{{ post.title }}</a>{% if post.flags.nsfw %} <small class="nsfw">NSFW</small>{% endif %}
</p>
@@ -86,11 +99,19 @@
</svg>
</a>
{% else if (prefs.layout.is_empty() || prefs.layout == "card") && post.post_type == "gif" %}
<video class="post_media_video short" src="{{ post.media.url }}" width="{{ post.media.width }}px" height="{{ post.media.height }}px" controls loop autoplay><a href={{ post.media.url }}>Video</a></video>
<video class="post_media_video short" src="{{ post.media.url }}" width="{{ post.media.width }}" height="{{ post.media.height }}" controls loop autoplay><a href={{ post.media.url }}>Video</a></video>
{% else if (prefs.layout.is_empty() || prefs.layout == "card") && post.post_type == "video" %}
<video class="post_media_video short" src="{{ post.media.url }}" width="{{ post.media.width }}px" height="{{ post.media.height }}px" poster="{{ post.media.poster }}" preload="none" controls autoplay><a href={{ post.media.url }}>Video</a></video>
{% if prefs.use_hls == "on" && !post.media.alt_url.is_empty() %}
<video class="post_media_video short" width="{{ post.media.width }}" height="{{ post.media.height }}" poster="{{ post.media.poster }}" controls preload="none">
<source src="{{ post.media.alt_url }}" type="application/vnd.apple.mpegurl" />
<source src="{{ post.media.url }}" type="video/mp4" />
</video>
{% else %}
<video class="post_media_video short" src="{{ post.media.url }}" width="{{ post.media.width }}" height="{{ post.media.height }}" poster="{{ post.media.poster }}" preload="none" controls autoplay><a href={{ post.media.url }}>Video</a></video>
{% call render_hls_notification(format!("{}%23{}", &self.url[1..].replace("&amp;", "%26"), post.id)) %}
{% endif %}
{% else if post.post_type != "self" %}
<a class="post_thumbnail {% if post.thumbnail.url.is_empty() %}no_thumbnail{% endif %}" href="{% if post.post_type == "link" %}{{ post.media.url }}{% else %}{{ post.permalink }}{% endif %}">
<a class="post_thumbnail {% if post.thumbnail.url.is_empty() %}no_thumbnail{% endif %}" href="{% if post.post_type == "link" %}{{ post.media.url }}{% else %}{{ post.permalink }}{% endif %}" rel="nofollow">
{% if post.thumbnail.url.is_empty() %}
<svg viewBox="0 0 100 106" width="140" height="53" xmlns="http://www.w3.org/2000/svg">
<title>Thumbnail</title>
|
||||
</a>
|
||||
{% endif %}
|
||||
|
||||
<div class="post_score">{{ post.score }}<span class="label"> Upvotes</span></div>
|
||||
<div class="post_score" title="{{ post.score.1 }}">{{ post.score.0 }}<span class="label"> Upvotes</span></div>
|
||||
<div class="post_footer">
|
||||
<a href="{{ post.permalink }}" class="post_comments">{{ post.comments }} comments</a>
|
||||
<a href="{{ post.permalink }}" class="post_comments" title="{{ post.comments.1 }} comments">{{ post.comments.0 }} comments</a>
|
||||
</div>
|
||||
</div>
|
||||
{%- endmacro %}
|
||||