Compare commits

...

100 Commits

Author SHA1 Message Date
feedc572cd change bin name 2024-11-03 10:36:46 +13:00
a3bc16f7d2 port update checker to gitea compatible version 2024-11-03 10:08:26 +13:00
bd4cb96c0f "fix" scraper 2024-11-03 09:36:11 +13:00
a9c99cc752 Merge remote-tracking branch 'upstream/main' 2024-11-03 09:34:12 +13:00
Matthew Esposito
f03bdcf472
feat: display whether or not the instance is up to date on error (#310) 2024-11-01 18:16:25 -04:00
Matthew Esposito
2fd358f3ed
feat(hls): add video quality preference (#306) 2024-11-01 12:28:52 -04:00
Matthew Esposito
5ef57812f8 style: fix clippy 2024-11-01 11:39:05 -04:00
Nolan Poe
d17d097b12
Fix parts of CI (#304)
* Run cargo fmt, hide clippy::cmp_owned errors

* Bump deps

* Fix failing test

* Update src/client.rs

---------

Co-authored-by: Matthew Esposito <matt@matthew.science>
2024-10-31 22:50:50 -04:00
Alex
a96894c743
enables http2 crate feature, replaces http1 protocol with http2 on co… (#305) 2024-10-31 22:48:19 -04:00
Matthew Esposito
9aea9c90a2 fix: reduce to minimum patch, fix clippy 2024-10-31 16:09:35 -04:00
Matthew Esposito
efdf1848ac fix: emergency patch for 403 2024-10-31 16:06:29 -04:00
Matthew Esposito
bc9530821d feat(scraper): add output file 2024-10-30 15:15:38 -04:00
Matthew Esposito
f3d2f0cc59 feat(scraper): add scraper CLI 2024-10-21 20:54:05 -04:00
Matthew Esposito
49ef59e000 chore: make library 2024-10-21 20:46:03 -04:00
6773e3756b Update README.md 2024-10-20 21:16:03 +13:00
1c1e627815 v0.35.3 2024-10-20 19:31:07 +13:00
4b854eb84d fix/change feed list alignment 2024-10-20 19:20:01 +13:00
712790acbe fix footer spacing 2024-10-20 18:08:26 +13:00
062d810aad settings rearrangement 2024-10-20 18:06:26 +13:00
64eec64ebe Merge remote-tracking branch 'upstream/main' 2024-10-20 17:32:13 +13:00
DokterKaj
3ff907d6c1
additional new colour tweaks (#285) 2024-10-12 11:42:15 -04:00
Ben Sherman
f4a457e529
Add additional themes to README (#284) 2024-10-11 15:43:45 -04:00
DokterKaj
7dda8d9bbb
use better accent colour + add libreddit styles (#281)
* update favicon with new logo

* only have 32x32 .ico file

* use #d74253 as new accent colour + add old libreddit styles + bolden accented buttons

* fix unrenamed libreddit themes
2024-10-10 18:45:39 -04:00
DokterKaj
b99412b4a1
feat: update logos and accents (#280)
* feat: update logos, accents

* fix apple-touch-icon.png size

* remove width, length

---------

Co-authored-by: DokterKaj <dokterkaj@gmail.com>
2024-10-06 15:19:37 -04:00
Guillaume Gomez
1838fdaea4
Replace askama with rinja (#276) 2024-10-02 17:43:13 -04:00
Ben Beasley
f71b0cd178
chore(deps): Update brotli from 6.0 to 7.0 (#277)
* chore(deps): Update brotli from 6.0 to 7.0

* Update Cargo.lock for brotli 7.0
2024-10-02 14:18:41 -04:00
Guanran Wang
604db902e9
Fix systemd service (#275)
This format is not recognized by systemd. As shown in the following log:

/etc/systemd/system/redlib.service:33: System call ~@privileged is not known, ignoring.
/etc/systemd/system/redlib.service:33: System call ~@resources is not known, ignoring.
2024-10-01 15:55:42 -04:00
Matthew Esposito
e57eaa0b78 fix(issue): render checkbox in issue template 2024-09-29 22:31:23 -04:00
Matthew Esposito
31ad8c5f7b fix(issue): add checkbox for latest commit 2024-09-29 16:29:38 -04:00
mdnghtman
5aef97410c
Update redlib.conf (#271)
This setting expects an array of subs and not a boolean value. 
This confuses new users and also seems to be unintentional. 

Removing the comment character would lead to an error output on the start page. “Couldn't send request to Reddit: Post url contains non-ASCII characters | /r/off (sub1%2Bsub2%2Bsub3)/hot.json?&raw_json=1”
2024-09-29 14:45:03 -04:00
DokterKaj
8d0ed4682e
feat(search): redirect u/ and user/ to profile (#268) 2024-09-27 08:29:33 -04:00
Butter Cat
fe4fed0504
Make jump to comment work from user's page and make last <p> in the post and comment bodies get the margin above it like the others (#219)
* Make last <p> in post body get padding above it

* When going to a comment from a user's page jump to it on page load
2024-09-27 00:44:32 -04:00
Matthew Esposito
6e2e679a0e chore(oauth): add additional logging to login routine 2024-09-26 15:06:39 -04:00
Matthew Esposito
6b44c1abf2 chore(oauth): add additional logging to login routine 2024-09-26 15:04:36 -04:00
Butter Cat
a807002ddf
Fix #206 and make (most) emotes embed in comments (#209)
* Fix links not being converted when multiple emojis are in one comment

* Make (most) emotes embed within comments

* Restore the behavior that the "rewrite_urls_removes_backslashes_and_rewrites_url" test looks for

* Listen to cargo fmt and cargo clippy's suggestions as well as removing some leftover comments and code

---------

Co-authored-by: Matthew Esposito <matt@matthew.science>
2024-09-25 13:36:23 -04:00
Matthew Esposito
403513ac4c
fix(search): handle queries' urlencoding (#264)
* fix(search): handle queries' urlencoding

* fix(search): handle queries' urlencoding
2024-09-24 23:30:06 -04:00
Matthew Esposito
72f7d9d08c
fix(search): handle multi-sub search (#263) 2024-09-24 23:20:12 -04:00
Matthew Esposito
e6273e2ed5
fix(client): catch json suspended user error (#262)
* fix(client): catch json suspended user error
2024-09-24 23:13:36 -04:00
Matthew Esposito
f1d4e6a417
fix(client): catch various json errors to properly render error page (#261)
* fix(client): catch various json errors to properly render error page

* fix(client): catch various json errors to properly render error page
2024-09-24 23:01:28 -04:00
Matthew Esposito
e0d7837c02
fix(client): don't catch network policy errors, since they indicate q… (#259) 2024-09-24 21:45:47 -04:00
Matthew Esposito
2d6ac78acf
chore(client): update new oauth path (#258) 2024-09-24 21:28:54 -04:00
Matthew Esposito
1e54c639d3
fix(client): add a timeout and retry logic to oauth daemon (#256)
* fix(client): add a timeout and retry logic to oauth daemon

* fix(client): add a timeout and retry logic to oauth daemon
2024-09-24 21:02:12 -04:00
Matthew Esposito
d5f137ce47 fix(funding): add sponsor link 2024-09-21 15:48:19 -04:00
Matthew Esposito
245fd9d408 fix(funding): update funding 2024-09-21 15:47:44 -04:00
Matthew Esposito
b54620b5aa fix(client): use async_recursion crate 2024-09-21 15:44:27 -04:00
freedit-dev
69c7a69afd
add description for rss item (just like https://news.ycombinator.com/… (#220)
fix #201
2024-09-21 00:05:32 -04:00
Butter Cat
2991813c2d
Make poll results appear inside of a post (#218) 2024-09-21 00:02:34 -04:00
Matthew Esposito
7156be6ad0 fix(client): fix failing tests, retries for canonical_path 2024-09-20 23:57:18 -04:00
f9b1a832c4 .dockerignore 2024-09-19 14:17:02 +12:00
1ffbce1ad8 Merge remote-tracking branch 'upstream/main' 2024-09-19 14:14:17 +12:00
Matthew Esposito
793047f63f fix(client): revert to hyper-rustls=0.24.2 2024-09-18 11:24:00 -04:00
Matthew Esposito
3625fdfdbe fix(ci): temporarily disable README updates 2024-09-17 14:42:09 -04:00
Matthew Esposito
28f85f2599 Attempting to fix main-docker.yml for downloading digests 2024-09-17 14:39:42 -04:00
wuchyi
7be29f609c
Attempting to fix main-docker.yml for downloading digests (#243)
* Update main-docker.yml (digests)

Updated digest name on upload (as per 1187084)

* Update main-docker.yml (digests download)

Trying another fix based on the template provided here https://github.com/actions/download-artifact for downloading multiple (filtered) Artifacts to the same directory
2024-09-17 14:37:39 -04:00
wuchyi
f18d135045
Update main-docker.yml (digests) (#238)
Updated digest name on upload (as per 1187084)
2024-09-17 14:09:32 -04:00
f2c410f9ea Merge remote-tracking branch 'upstream/main' 2024-09-17 12:28:21 +12:00
Matthew Esposito
8ef45456d6 actions trigger 2024-09-16 16:39:07 -04:00
Matthew Esposito
118708400a fix(ci): unique name 2024-09-16 16:36:24 -04:00
Matthew Esposito
28e72c9058
fix(ci): bump versions (#234) 2024-09-16 16:29:52 -04:00
Matthew Esposito
0b15250cc8
fix(oauth): catch network policy violation and rate limit (#233) 2024-09-16 16:16:08 -04:00
Matthew Esposito
408ebe6ef1
feat(post): add archive.is link for link posts (#165) 2024-09-05 12:00:09 -04:00
Pim
7a0ea1fbd3
fix: spoiler hover and video size (#196)
* fix: unblur both media as body when hoevering over either one

* fix: set min size for video

when the video is not loaded, the size is determined by the poster image. But when the poster image returns a 404, then the video had a size of 0x0

* chore: tab/space
2024-09-05 11:59:43 -04:00
Butter Cat
db8b92ea55
Fix a whole bunch of styling bugs (#193)
* Fix a whole bunch of mobile styling bugs

* Make searchbox scroll fix only apply in mobile mode to prevent bug

* Remove the min-width requirement for the main column

This was meant to be removed already, this is what fixes posts having an odd right side gap before swapping to the mobile layout

* Make margins consistent between fixed and unfixed navbar settings

* Remove some empty space from deleted option

* Make mobile layout post width fix only apply in mobile mode to prevent bug

* Make sure some options only get applied to the elements that need them, also fix the margins on the settings page

* Move search comments option before it starts touching the sort options and wrapping the x amount of comments text

* Trigger the even further compacted layout a little earlier, right before text begins wrapping in odd ways

* In the extra small mobile layout make give up/downvote numbers enough room so they aren't clipping out of their box

* Fix https://github.com/redlib-org/redlib/issues/172

* Properly center search box instead of having it slightly skewed

* Undo word wrapping since it breaks the sorting options and the only other viable setting has an absolute conniption on Chrome for some reason

* Readd word wrapping and just force it to normal for the sorting section

* Make post flair line up with title

* Make post flair position consistent

* Make footer text properly horizontally centered in mobile mode and fix slight vertical misalignment issues

* Make feeds button appear in settings menu to keep navbar looking consistent

* Fix extra navbar padding on search page

* Reduce gap between navbar and content in mobile mode

* Reduce gap between navbar and content in mobile mode
2024-09-05 11:59:21 -04:00
Kot C
c494fbec31
Insert noindex meta for ROBOTS_DISABLE_INDEXING (#199) (#207) 2024-09-03 19:44:04 -04:00
Butter Cat
438e412be3
Add anchor to comment link (#212) 2024-09-03 19:21:33 -04:00
dependabot[bot]
d0e081e6a0
chore(deps): bump actions/download-artifact from 3 to 4.1.7 in /.github/workflows (#214)
Bumps [actions/download-artifact](https://github.com/actions/download-artifact) from 3 to 4.1.7.
- [Release notes](https://github.com/actions/download-artifact/releases)
- [Commits](https://github.com/actions/download-artifact/compare/v3...v4.1.7)

---
updated-dependencies:
- dependency-name: actions/download-artifact
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-03 19:21:11 -04:00
b076b076e3 v0.35.2 2024-08-31 10:20:57 +12:00
fdaba8344d set video download MIME
fixes #11
2024-08-31 10:18:05 +12:00
Davide Cavalca
041ecceeaf
Update license tag (#200) 2024-08-07 14:09:54 -04:00
Butter Cat
3677ca10e2
Fix sort options overflow on small screens (#192) 2024-07-25 20:26:27 -04:00
Butter Cat
21fd34710c
Fix font sizes in search bar being incorrect (#190) 2024-07-24 19:56:06 -04:00
Jere Vuola
4205959ade
fix(docs): Add env HIDE_SIDEBAR_AND_SUMMARY (#188) 2024-07-23 21:02:15 -04:00
Matthew Esposito
27b56c1781
fix(client): handle new gated and quarantined error types (#187)
* fix(client): handle new gated and quarantined error types

* test(client): add test for gated and quarantined
2024-07-21 14:22:54 -04:00
Matthew Esposito
374238abc3
Add RSS feeds (fix #57) (#90)
* Add RSS feeds

* feat(rss): feature-ify rss

* feat(rss): config-ify rss

* fix(rss): update info page

* feat(rss): conditionally add RSS feeds to user and sub pages

* feat(rss): implement URLs for RSS
2024-07-21 14:09:34 -04:00
Pim
9bdb5c8966
feat: also blur text in post body for spoilers (#186)
chore: simplify blur classes
2024-07-21 10:22:40 -04:00
syeopite
410872d988
Make search bar responsive on mobile devices (#178)
* Search: Apply bg on elements rather than container

This changes allows moving the individual elements that composes
the search bar around without losing the background on the elements.

* Update search widget semantic structure

* Make search bar design responsive on small screens

* Fix border color

* Polish
2024-07-17 21:28:50 -04:00
syeopite
c890e809b7
Improve post information widget of user comments for devices under 480px width (#183)
* Fix user comment post link disappearing when < 480px

* Improve user comment metadata design for mobile

* Remove formerly unused CSS class style

The prior commit introduced the usage of the `comment_subreddit` class
to identify the subreddit that the reddit user posted the comment on.

However, this class came with a legacy style within the CSS file that was
previously not used anywhere within the current day Redlib

As such this style has been removed.
2024-07-17 21:27:13 -04:00
eedbe06af6 Merge branch 'ugly-layouts'
sure why not :)
- last words 2024
2024-07-07 19:34:27 +12:00
6042bca612 mosaic -> waterfall 2024-07-07 18:35:53 +12:00
7e85ae25b1 Merge remote-tracking branch 'upstream/main'
+ repository link update
2024-07-07 17:19:28 +12:00
Pim
4f21388643
fix: also use hls if possible for gifs in post_in_list macro (#177) 2024-07-05 16:33:06 -04:00
Pim
8a917fcde3
feat: add download button on image/gif/video posts (#173)
* feat: add download button on image/gif/video posts

* chore: fix formatting

* chore: dont create reference
2024-07-04 21:32:12 -04:00
Matthew Esposito
67a890cab3
fix(posts): fix sort call on new (#171) 2024-07-02 08:04:27 -04:00
Pim
366bc17f97
feat: show post link title for comments on user page (#169) 2024-07-01 17:15:50 -04:00
Matthew Esposito
d9e7681004 v0.35.1 2024-06-29 13:28:18 -04:00
Matthew Esposito
f74d1affb6
fix(posts): manually sort by flags (#168)
* fix(posts): manually sort by flags

* fix(posts): shorten sort call
2024-06-29 13:26:09 -04:00
Matthew Esposito
f44638a2cb v0.35.0 2024-06-29 12:00:34 -04:00
Matthew Esposito
beb4cf193b
fix(posts): manually sort by created date (#166) 2024-06-29 11:48:42 -04:00
Matthew Esposito
c565ebfb01 refactor(log): update some logs 2024-06-29 10:44:33 -04:00
2b8112a3fb mosaic layout 2024-06-30 00:04:46 +12:00
Matthew Esposito
459a8e1245 refactor(log): shorten some logs 2024-06-29 00:20:19 -04:00
Matthew Esposito
0f7eba717e fix(client): Handle invalid reddit response of base URL location 2024-06-28 22:41:36 -04:00
Matthew Esposito
ea87ec33a1
fix(subreddit): handle plus-encoding errors even better (#163)
* fix(subreddit): handle plus-encoding errors even better

* chore(clippy): fix lint
2024-06-28 22:28:58 -04:00
Matthew Esposito
102cd2f23f
Merge pull request #162 from redlib-org/oauth_arc_swap
fix(oauth): arc_swap
2024-06-28 18:17:00 -04:00
Matthew Esposito
3b2ad212d5 fix(oauth): arc_swap 2024-06-28 18:14:47 -04:00
Matthew Esposito
4dc7ff8165
Merge pull request #160 from redlib-org/oauth_oppenheimer
fix(oauth): even more atomics to avoid simultaneous token rollover
2024-06-27 23:35:51 -04:00
Matthew Esposito
2f8a38d8c7 chore(clippy): fix lint 2024-06-27 23:34:27 -04:00
Matthew Esposito
13083e999c fix(oauth): handle extremely rare race condition by atomically compare_exchanging 2024-06-27 23:32:17 -04:00
Matthew Esposito
4e2ec3fbc9 fix(oauth): handle case where a rate limit sneaks in 2024-06-27 23:29:55 -04:00
Matthew Esposito
89313f73e6 fix(oauth): atomics to avoid simultaneous token rollover 2024-06-27 23:26:31 -04:00
43 changed files with 3162 additions and 1760 deletions

1
.dockerignore Normal file
View File

@ -0,0 +1 @@
target

5
.github/FUNDING.yml vendored
View File

@ -1,2 +1,3 @@
liberapay: spike liberapay: sigaloid
custom: ['https://www.buymeacoffee.com/spikecodes'] buy_me_a_coffee: sigaloid
github: sigaloid

View File

@ -7,6 +7,10 @@ assignees: ''
--- ---
<!--
BEFORE FILING A BUG REPORT: Ensure that you are running the latest git commit. Visit /info on your instance, and ensure the git commit listed is the same commit listed on the home page.
-->
## Describe the bug ## Describe the bug
<!-- <!--
A clear and concise description of what the bug is. A clear and concise description of what the bug is.
@ -31,3 +35,7 @@ Steps to reproduce the behavior:
<!-- <!--
Add any other context about the problem here. Add any other context about the problem here.
--> -->
<!-- Mandatory -->
- [ ] I checked that the instance that this was reported on is running the latest git commit, or I can reproduce it locally on the latest git commit

View File

@ -15,15 +15,13 @@ jobs:
fail-fast: false fail-fast: false
matrix: matrix:
include: include:
- { platform: linux/amd64, target: x86_64-unknown-linux-musl} - { platform: linux/amd64, target: x86_64-unknown-linux-musl }
- { platform: linux/arm64, target: aarch64-unknown-linux-musl} - { platform: linux/arm64, target: aarch64-unknown-linux-musl }
- { platform: linux/arm/v7, target: armv7-unknown-linux-musleabihf} - { platform: linux/arm/v7, target: armv7-unknown-linux-musleabihf }
steps: steps:
- - name: Checkout
name: Checkout
uses: actions/checkout@v4 uses: actions/checkout@v4
- - name: Docker meta
name: Docker meta
id: meta id: meta
uses: docker/metadata-action@v5 uses: docker/metadata-action@v5
with: with:
@ -31,21 +29,17 @@ jobs:
tags: | tags: |
type=sha type=sha
type=raw,value=latest,enable={{is_default_branch}} type=raw,value=latest,enable={{is_default_branch}}
- - name: Set up QEMU
name: Set up QEMU
uses: docker/setup-qemu-action@v3 uses: docker/setup-qemu-action@v3
- - name: Set up Docker Buildx
name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3 uses: docker/setup-buildx-action@v3
- - name: Login to Quay.io Container Registry
name: Login to Quay.io Container Registry
uses: docker/login-action@v3 uses: docker/login-action@v3
with: with:
registry: quay.io registry: quay.io
username: ${{ secrets.QUAY_USERNAME }} username: ${{ secrets.QUAY_USERNAME }}
password: ${{ secrets.QUAY_ROBOT_TOKEN }} password: ${{ secrets.QUAY_ROBOT_TOKEN }}
- - name: Build and push
name: Build and push
id: build id: build
uses: docker/build-push-action@v5 uses: docker/build-push-action@v5
with: with:
@ -55,17 +49,15 @@ jobs:
outputs: type=image,name=${{ env.REGISTRY_IMAGE }},push-by-digest=true,name-canonical=true,push=true outputs: type=image,name=${{ env.REGISTRY_IMAGE }},push-by-digest=true,name-canonical=true,push=true
file: Dockerfile file: Dockerfile
build-args: TARGET=${{ matrix.target }} build-args: TARGET=${{ matrix.target }}
- - name: Export digest
name: Export digest
run: | run: |
mkdir -p /tmp/digests mkdir -p /tmp/digests
digest="${{ steps.build.outputs.digest }}" digest="${{ steps.build.outputs.digest }}"
touch "/tmp/digests/${digest#sha256:}" touch "/tmp/digests/${digest#sha256:}"
- - name: Upload digest
name: Upload digest uses: actions/upload-artifact@v4
uses: actions/upload-artifact@v3
with: with:
name: digests name: digests-${{ matrix.target }}
path: /tmp/digests/* path: /tmp/digests/*
if-no-files-found: error if-no-files-found: error
retention-days: 1 retention-days: 1
@ -74,17 +66,16 @@ jobs:
needs: needs:
- build - build
steps: steps:
- - name: Download digests
name: Download digests uses: actions/download-artifact@v4.1.7
uses: actions/download-artifact@v3
with: with:
name: digests
path: /tmp/digests path: /tmp/digests
- pattern: digests-*
name: Set up Docker Buildx merge-multiple: true
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3 uses: docker/setup-buildx-action@v3
- - name: Docker meta
name: Docker meta
id: meta id: meta
uses: docker/metadata-action@v5 uses: docker/metadata-action@v5
with: with:
@ -92,31 +83,27 @@ jobs:
tags: | tags: |
type=sha type=sha
type=raw,value=latest,enable={{is_default_branch}} type=raw,value=latest,enable={{is_default_branch}}
- - name: Login to Quay.io Container Registry
name: Login to Quay.io Container Registry
uses: docker/login-action@v3 uses: docker/login-action@v3
with: with:
registry: quay.io registry: quay.io
username: ${{ secrets.QUAY_USERNAME }} username: ${{ secrets.QUAY_USERNAME }}
password: ${{ secrets.QUAY_ROBOT_TOKEN }} password: ${{ secrets.QUAY_ROBOT_TOKEN }}
- - name: Create manifest list and push
name: Create manifest list and push
working-directory: /tmp/digests working-directory: /tmp/digests
run: | run: |
docker buildx imagetools create $(jq -cr '.tags | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \ docker buildx imagetools create $(jq -cr '.tags | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \
$(printf '${{ env.REGISTRY_IMAGE }}@sha256:%s ' *) $(printf '${{ env.REGISTRY_IMAGE }}@sha256:%s ' *)
- name: Push README to Quay.io # - name: Push README to Quay.io
uses: christian-korneck/update-container-description-action@v1 # uses: christian-korneck/update-container-description-action@v1
env: # env:
DOCKER_APIKEY: ${{ secrets.APIKEY__QUAY_IO }} # DOCKER_APIKEY: ${{ secrets.APIKEY__QUAY_IO }}
with: # with:
destination_container_repo: quay.io/redlib/redlib # destination_container_repo: quay.io/redlib/redlib
provider: quay # provider: quay
readme_file: 'README.md' # readme_file: 'README.md'
- - name: Inspect image
name: Inspect image
run: | run: |
docker buildx imagetools inspect ${{ env.REGISTRY_IMAGE }}:${{ steps.meta.outputs.version }} docker buildx imagetools inspect ${{ env.REGISTRY_IMAGE }}:${{ steps.meta.outputs.version }}

View File

@ -56,7 +56,7 @@ jobs:
- name: Calculate SHA256 checksum - name: Calculate SHA256 checksum
run: sha256sum target/x86_64-unknown-linux-musl/release/redlib > redlib.sha256 run: sha256sum target/x86_64-unknown-linux-musl/release/redlib > redlib.sha256
- uses: actions/upload-artifact@v3 - uses: actions/upload-artifact@v4
name: Upload a Build Artifact name: Upload a Build Artifact
with: with:
name: redlib name: redlib

880
Cargo.lock generated

File diff suppressed because it is too large Load Diff

View File

@ -1,28 +1,30 @@
[package] [package]
name = "redsunlib" name = "redsunlib"
description = " Alternative private front-end to Reddit" description = " Alternative private front-end to Reddit"
license = "AGPL-3.0" license = "AGPL-3.0-only"
repository = "https://git.stardust.wtf/iridium/redlib" repository = "https://git.stardust.wtf/iridium/redsunlib"
version = "0.35.1" version = "0.35.3"
authors = [ authors = [
"Matthew Esposito <matt+cargo@matthew.science>", "Matthew Esposito <matt+cargo@matthew.science>",
"spikecodes <19519553+spikecodes@users.noreply.github.com>", "spikecodes <19519553+spikecodes@users.noreply.github.com>",
] ]
edition = "2021" edition = "2021"
default-run = "redsunlib"
[dependencies] [dependencies]
askama = { version = "0.12.1", default-features = false } rinja = { version = "0.3.4", default-features = false }
cached = { version = "0.51.3", features = ["async"] } cached = { version = "0.51.3", features = ["async"] }
clap = { version = "4.4.11", default-features = false, features = [ clap = { version = "4.4.11", default-features = false, features = [
"std", "std",
"env", "env",
"derive",
] } ] }
regex = "1.10.2" regex = "1.10.2"
serde = { version = "1.0.193", features = ["derive"] } serde = { version = "1.0.193", features = ["derive"] }
cookie = "0.18.0" cookie = "0.18.0"
futures-lite = "2.2.0" futures-lite = "2.2.0"
hyper = { version = "0.14.28", features = ["full"] } hyper = { version = "0.14.28", features = ["full"] }
hyper-rustls = "0.25.0" hyper-rustls = { version = "0.24.2", features = [ "http2" ] }
percent-encoding = "2.3.1" percent-encoding = "2.3.1"
route-recognizer = "0.3.1" route-recognizer = "0.3.1"
serde_json = "1.0.108" serde_json = "1.0.108"
@ -31,7 +33,7 @@ time = { version = "0.3.31", features = ["local-offset"] }
url = "2.5.0" url = "2.5.0"
rust-embed = { version = "8.1.0", features = ["include-exclude"] } rust-embed = { version = "8.1.0", features = ["include-exclude"] }
libflate = "2.0.0" libflate = "2.0.0"
brotli = { version = "6.0.0", features = ["std"] } brotli = { version = "7.0.0", features = ["std"] }
toml = "0.8.8" toml = "0.8.8"
once_cell = "1.19.0" once_cell = "1.19.0"
serde_yaml = "0.9.29" serde_yaml = "0.9.29"
@ -42,6 +44,11 @@ fastrand = "2.0.1"
log = "0.4.20" log = "0.4.20"
pretty_env_logger = "0.5.0" pretty_env_logger = "0.5.0"
dotenvy = "0.15.7" dotenvy = "0.15.7"
rss = "2.0.7"
arc-swap = "1.7.1"
serde_json_path = "0.6.7"
async-recursion = "1.1.1"
[dev-dependencies] [dev-dependencies]
lipsum = "0.9.0" lipsum = "0.9.0"
@ -51,3 +58,11 @@ sealed_test = "1.0.0"
codegen-units = 1 codegen-units = 1
lto = true lto = true
strip = "symbols" strip = "symbols"
[[bin]]
name = "redsunlib"
path = "src/main.rs"
[[bin]]
name = "scraper"
path = "src/scraper/main.rs"

View File

@ -64,14 +64,13 @@ Redlib currently implements most of Reddit's (signed-out) functionalities but st
**Red sun** in the sky + Red**lib** = Redsunlib **Red sun** in the sky + Red**lib** = Redsunlib
And at the time, I was reading an excerpt from Mao Zedong, so the name seemed appropriate. But paradoxically named since Reddit is basically the sinophobia capital of the internet :/ <sup>I do self criticism constantly, because I'm trapped in a Maoist *cult* where comrades (white terrorists) criticize me merciloussly for having a fascist credit card (VISA Silver Signature Rewards) They won't let me order vegan pizza anymore because the phone is fascist and "summoning my pizza slave with bourgeois app" is "bad vibes"</sup>
## Built with ## Built with
- [Rust](https://www.rust-lang.org/) - Programming language - [Rust](https://www.rust-lang.org/) - Programming language
- [Hyper](https://github.com/hyperium/hyper) - HTTP server and client - [Hyper](https://github.com/hyperium/hyper) - HTTP server and client
- [Askama](https://github.com/djc/askama) - Templating engine - [Rinja](https://github.com/rinja-rs/rinja) - Templating engine
- [Rustls](https://github.com/rustls/rustls) - TLS library - [Rustls](https://github.com/rustls/rustls) - TLS library
## How is it different from other Reddit front ends? ## How is it different from other Reddit front ends?
@ -312,17 +311,18 @@ Assign a default value for each instance-specific setting by passing environment
| `ROBOTS_DISABLE_INDEXING` | `["on", "off"]` | `off` | Disables indexing of the instance by search engines. | | `ROBOTS_DISABLE_INDEXING` | `["on", "off"]` | `off` | Disables indexing of the instance by search engines. |
| `PUSHSHIFT_FRONTEND` | String | `undelete.pullpush.io` | Allows the server to set the Pushshift frontend to be used with "removed" links. | | `PUSHSHIFT_FRONTEND` | String | `undelete.pullpush.io` | Allows the server to set the Pushshift frontend to be used with "removed" links. |
| `PORT` | Integer 0-65535 | `8080` | The **internal** port Redlib listens on. | | `PORT` | Integer 0-65535 | `8080` | The **internal** port Redlib listens on. |
| `ENABLE_RSS` | `["on", "off"]` | `off` | Enables RSS feed generation. |
| `FULL_URL` | String | (empty) | Allows for proper URLs (for now, only needed by RSS)
## Default user settings ## Default user settings
Assign a default value for each user-modifiable setting by passing environment variables to Redlib in the format `REDLIB_DEFAULT_{Y}`. Replace `{Y}` with the setting name (see list below) in capital letters. Assign a default value for each user-modifiable setting by passing environment variables to Redlib in the format `REDLIB_DEFAULT_{Y}`. Replace `{Y}` with the setting name (see list below) in capital letters.
| Name | Possible values | Default value | | Name | Possible values | Default value |
| ----------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------- | ------------- | | ----------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------- | ------------- |
| `THEME` | `["system", "light", "dark", "black", "dracula", "nord", "laserwave", "violet", "gold", "rosebox", "gruvboxdark", "gruvboxlight", "tokyoNight", "icebergDark"]` | `system` | | `THEME` | `["system", "light", "dark", "black", "dracula", "nord", "laserwave", "violet", "gold", "rosebox", "gruvboxdark", "gruvboxlight", "tokyoNight", "catppuccin", "icebergDark", "doomone", "libredditBlack", "libredditDark", "libredditLight"]` | `system` |
| `MASCOT` | `["BoymoderBlahaj", "redsunlib" ... Add more at ./static/mascots] ` | _(none)_ | | `MASCOT` | `["BoymoderBlahaj", "redsunlib" ... Add more at ./static/mascots] ` | _(none)_ |
| `FRONT_PAGE` | `["default", "popular", "all"]` | `default` | | `FRONT_PAGE` | `["default", "popular", "all"]` | `default` |
| `LAYOUT` | `["card", "clean", "compact"]` | `card` | | `LAYOUT` | `["card", "clean", "compact", "old", "waterfall"]` | `card` |
| `WIDE` | `["on", "off"]` | `off` | | `WIDE` | `["on", "off"]` | `off` |
| `POST_SORT` | `["hot", "new", "top", "rising", "controversial"]` | `hot` | | `POST_SORT` | `["hot", "new", "top", "rising", "controversial"]` | `hot` |
| `COMMENT_SORT` | `["confidence", "top", "new", "controversial", "old"]` | `confidence` | | `COMMENT_SORT` | `["confidence", "top", "new", "controversial", "old"]` | `confidence` |
@ -337,4 +337,5 @@ Assign a default value for each user-modifiable setting by passing environment v
| `HIDE_AWARDS` | `["on", "off"]` | `off` | | `HIDE_AWARDS` | `["on", "off"]` | `off` |
| `DISABLE_VISIT_REDDIT_CONFIRMATION` | `["on", "off"]` | `off` | | `DISABLE_VISIT_REDDIT_CONFIRMATION` | `["on", "off"]` | `off` |
| `HIDE_SCORE` | `["on", "off"]` | `off` | | `HIDE_SCORE` | `["on", "off"]` | `off` |
| `HIDE_SIDEBAR_AND_SUMMARY` | `["on", "off"]` | `off` |
| `FIXED_NAVBAR` | `["on", "off"]` | `on` | | `FIXED_NAVBAR` | `["on", "off"]` | `on` |

View File

@ -62,7 +62,7 @@
"REDLIB_BANNER": { "REDLIB_BANNER": {
"required": false "required": false
}, },
"REDLIB_ROBOTS_DISABLE_INDEXING": { "REDLIB_ROBOTS_DISABLE_INDEXING": {
"required": false "required": false
}, },
"REDLIB_DEFAULT_SUBSCRIPTIONS": { "REDLIB_DEFAULT_SUBSCRIPTIONS": {
@ -76,6 +76,12 @@
}, },
"REDLIB_PUSHSHIFT_FRONTEND": { "REDLIB_PUSHSHIFT_FRONTEND": {
"required": false "required": false
},
"REDLIB_ENABLE_RSS": {
"required": false
},
"REDLIB_FULL_URL": {
"required": false
} }
} }
} }

View File

@ -14,6 +14,6 @@ PORT=12345
#REDLIB_DEFAULT_FFMPEG_VIDEO_DOWNLOADS=off #REDLIB_DEFAULT_FFMPEG_VIDEO_DOWNLOADS=off
#REDLIB_DEFAULT_HIDE_HLS_NOTIFICATION=off #REDLIB_DEFAULT_HIDE_HLS_NOTIFICATION=off
#REDLIB_DEFAULT_AUTOPLAY_VIDEOS=off #REDLIB_DEFAULT_AUTOPLAY_VIDEOS=off
#REDLIB_DEFAULT_SUBSCRIPTIONS=off (sub1+sub2+sub3) #REDLIB_DEFAULT_SUBSCRIPTIONS=(sub1+sub2+sub3)
#REDLIB_DEFAULT_HIDE_AWARDS=off #REDLIB_DEFAULT_HIDE_AWARDS=off
#REDLIB_DEFAULT_DISABLE_VISIT_REDDIT_CONFIRMATION=off #REDLIB_DEFAULT_DISABLE_VISIT_REDDIT_CONFIRMATION=off

View File

@ -30,7 +30,8 @@ RestrictNamespaces=yes
RestrictRealtime=yes RestrictRealtime=yes
RestrictSUIDSGID=yes RestrictSUIDSGID=yes
SystemCallArchitectures=native SystemCallArchitectures=native
SystemCallFilter=@system-service ~@privileged ~@resources SystemCallFilter=@system-service
SystemCallFilter=~@privileged @resources
UMask=0077 UMask=0077
[Install] [Install]

View File

@ -1,8 +1,10 @@
use arc_swap::ArcSwap;
use cached::proc_macro::cached; use cached::proc_macro::cached;
use futures_lite::future::block_on; use futures_lite::future::block_on;
use futures_lite::{future::Boxed, FutureExt}; use futures_lite::{future::Boxed, FutureExt};
use hyper::client::HttpConnector; use hyper::client::HttpConnector;
use hyper::{body, body::Buf, client, header, Body, Client, Method, Request, Response, Uri}; use hyper::header::HeaderValue;
use hyper::{body, body::Buf, header, Body, Client, Method, Request, Response, Uri};
use hyper_rustls::HttpsConnector; use hyper_rustls::HttpsConnector;
use libflate::gzip; use libflate::gzip;
use log::{error, trace, warn}; use log::{error, trace, warn};
@ -11,9 +13,8 @@ use percent_encoding::{percent_encode, CONTROLS};
use serde_json::Value; use serde_json::Value;
use std::sync::atomic::Ordering; use std::sync::atomic::Ordering;
use std::sync::atomic::{AtomicU16, Ordering::SeqCst}; use std::sync::atomic::{AtomicBool, AtomicU16};
use std::{io, result::Result}; use std::{io, result::Result};
use tokio::sync::RwLock;
use crate::dbg_msg; use crate::dbg_msg;
use crate::oauth::{force_refresh_token, token_daemon, Oauth}; use crate::oauth::{force_refresh_token, token_daemon, Oauth};
@ -21,25 +22,34 @@ use crate::server::RequestExt;
use crate::utils::format_url; use crate::utils::format_url;
const REDDIT_URL_BASE: &str = "https://oauth.reddit.com"; const REDDIT_URL_BASE: &str = "https://oauth.reddit.com";
const REDDIT_URL_BASE_HOST: &str = "oauth.reddit.com";
pub static CLIENT: Lazy<Client<HttpsConnector<HttpConnector>>> = Lazy::new(|| { const REDDIT_SHORT_URL_BASE: &str = "https://redd.it";
let https = hyper_rustls::HttpsConnectorBuilder::new() const REDDIT_SHORT_URL_BASE_HOST: &str = "redd.it";
.with_native_roots()
.expect("No native root certificates found")
.https_only()
.enable_http1()
.build();
client::Client::builder().build(https)
});
pub static OAUTH_CLIENT: Lazy<RwLock<Oauth>> = Lazy::new(|| { const ALTERNATIVE_REDDIT_URL_BASE: &str = "https://www.reddit.com";
const ALTERNATIVE_REDDIT_URL_BASE_HOST: &str = "www.reddit.com";
pub static HTTPS_CONNECTOR: Lazy<HttpsConnector<HttpConnector>> =
Lazy::new(|| hyper_rustls::HttpsConnectorBuilder::new().with_native_roots().https_only().enable_http2().build());
pub static CLIENT: Lazy<Client<HttpsConnector<HttpConnector>>> = Lazy::new(|| Client::builder().build::<_, Body>(HTTPS_CONNECTOR.clone()));
pub static OAUTH_CLIENT: Lazy<ArcSwap<Oauth>> = Lazy::new(|| {
let client = block_on(Oauth::new()); let client = block_on(Oauth::new());
tokio::spawn(token_daemon()); tokio::spawn(token_daemon());
RwLock::new(client) ArcSwap::new(client.into())
}); });
pub static OAUTH_RATELIMIT_REMAINING: AtomicU16 = AtomicU16::new(99); pub static OAUTH_RATELIMIT_REMAINING: AtomicU16 = AtomicU16::new(99);
pub static OAUTH_IS_ROLLING_OVER: AtomicBool = AtomicBool::new(false);
static URL_PAIRS: [(&str, &str); 2] = [
(ALTERNATIVE_REDDIT_URL_BASE, ALTERNATIVE_REDDIT_URL_BASE_HOST),
(REDDIT_SHORT_URL_BASE, REDDIT_SHORT_URL_BASE_HOST),
];
/// Gets the canonical path for a resource on Reddit. This is accomplished by /// Gets the canonical path for a resource on Reddit. This is accomplished by
/// making a `HEAD` request to Reddit at the path given in `path`. /// making a `HEAD` request to Reddit at the path given in `path`.
/// ///
@ -53,13 +63,32 @@ pub static OAUTH_RATELIMIT_REMAINING: AtomicU16 = AtomicU16::new(99);
/// `Location` header. An `Err(String)` is returned if Reddit responds with a /// `Location` header. An `Err(String)` is returned if Reddit responds with a
/// 429, or if we were unable to decode the value in the `Location` header. /// 429, or if we were unable to decode the value in the `Location` header.
#[cached(size = 1024, time = 600, result = true)] #[cached(size = 1024, time = 600, result = true)]
pub async fn canonical_path(path: String) -> Result<Option<String>, String> { #[async_recursion::async_recursion]
let res = reddit_head(path.clone(), true).await?; pub async fn canonical_path(path: String, tries: i8) -> Result<Option<String>, String> {
if tries == 0 {
return Ok(None);
}
// for each URL pair, try the HEAD request
let res = {
// for url base and host in URL_PAIRS, try reddit_short_head(path.clone(), true, url_base, url_base_host) and if it succeeds, set res. else, res = None
let mut res = None;
for (url_base, url_base_host) in URL_PAIRS {
res = reddit_short_head(path.clone(), true, url_base, url_base_host).await.ok();
if let Some(res) = &res {
if !res.status().is_client_error() {
break;
}
}
}
res
};
let res = res.ok_or_else(|| "Unable to make HEAD request to Reddit.".to_string())?;
let status = res.status().as_u16(); let status = res.status().as_u16();
let policy_error = res.headers().get(header::RETRY_AFTER).is_some();
match status { match status {
429 => Err("Too many requests.".to_string()),
// If Reddit responds with a 2xx, then the path is already canonical. // If Reddit responds with a 2xx, then the path is already canonical.
200..=299 => Ok(Some(path)), 200..=299 => Ok(Some(path)),
@ -69,6 +98,7 @@ pub async fn canonical_path(path: String) -> Result<Option<String>, String> {
let Ok(original) = val.to_str() else { let Ok(original) = val.to_str() else {
return Err("Unable to decode Location header.".to_string()); return Err("Unable to decode Location header.".to_string());
}; };
// We need to strip the .json suffix from the original path. // We need to strip the .json suffix from the original path.
// In addition, we want to remove share parameters. // In addition, we want to remove share parameters.
// Cut it off here instead of letting it propagate all the way // Cut it off here instead of letting it propagate all the way
@ -81,7 +111,9 @@ pub async fn canonical_path(path: String) -> Result<Option<String>, String> {
// also remove all Reddit domain parts with format_url. // also remove all Reddit domain parts with format_url.
// Otherwise, it will literally redirect to Reddit.com. // Otherwise, it will literally redirect to Reddit.com.
let uri = format_url(stripped_uri); let uri = format_url(stripped_uri);
Ok(Some(uri))
// Decrement tries and try again
canonical_path(uri, tries - 1).await
} }
None => Ok(None), None => Ok(None),
}, },
@ -90,6 +122,12 @@ pub async fn canonical_path(path: String) -> Result<Option<String>, String> {
// as above), return a None. // as above), return a None.
300..=399 => Ok(None), 300..=399 => Ok(None),
// Rate limiting
429 => Err("Too many requests.".to_string()),
// Special condition rate limiting - https://github.com/redlib-org/redlib/issues/229
403 if policy_error => Err("Too many requests.".to_string()),
_ => Ok( _ => Ok(
res res
.headers() .headers()
@ -116,7 +154,7 @@ async fn stream(url: &str, req: &Request<Body>) -> Result<Response<Body>, String
let parsed_uri = url.parse::<Uri>().map_err(|_| "Couldn't parse URL".to_string())?; let parsed_uri = url.parse::<Uri>().map_err(|_| "Couldn't parse URL".to_string())?;
// Build the hyper client from the HTTPS connector. // Build the hyper client from the HTTPS connector.
let client: Client<_, Body> = CLIENT.clone(); let client: &Lazy<Client<_, Body>> = &CLIENT;
let mut builder = Request::get(parsed_uri); let mut builder = Request::get(parsed_uri);
@ -156,26 +194,32 @@ async fn stream(url: &str, req: &Request<Body>) -> Result<Response<Body>, String
/// Makes a GET request to Reddit at `path`. By default, this will honor HTTP /// Makes a GET request to Reddit at `path`. By default, this will honor HTTP
/// 3xx codes Reddit returns and will automatically redirect. /// 3xx codes Reddit returns and will automatically redirect.
fn reddit_get(path: String, quarantine: bool) -> Boxed<Result<Response<Body>, String>> { fn reddit_get(path: String, quarantine: bool) -> Boxed<Result<Response<Body>, String>> {
request(&Method::GET, path, true, quarantine) request(&Method::GET, path, true, quarantine, REDDIT_URL_BASE, REDDIT_URL_BASE_HOST)
} }
/// Makes a HEAD request to Reddit at `path`. This will not follow redirects. /// Makes a HEAD request to Reddit at `path, using the short URL base. This will not follow redirects.
fn reddit_head(path: String, quarantine: bool) -> Boxed<Result<Response<Body>, String>> { fn reddit_short_head(path: String, quarantine: bool, base_path: &'static str, host: &'static str) -> Boxed<Result<Response<Body>, String>> {
request(&Method::HEAD, path, false, quarantine) request(&Method::HEAD, path, false, quarantine, base_path, host)
} }
// /// Makes a HEAD request to Reddit at `path`. This will not follow redirects.
// fn reddit_head(path: String, quarantine: bool) -> Boxed<Result<Response<Body>, String>> {
// request(&Method::HEAD, path, false, quarantine, false)
// }
// Unused - reddit_head is only ever called in the context of a short URL
/// Makes a request to Reddit. If `redirect` is `true`, `request_with_redirect` /// Makes a request to Reddit. If `redirect` is `true`, `request_with_redirect`
/// will recurse on the URL that Reddit provides in the Location HTTP header /// will recurse on the URL that Reddit provides in the Location HTTP header
/// in its response. /// in its response.
fn request(method: &'static Method, path: String, redirect: bool, quarantine: bool) -> Boxed<Result<Response<Body>, String>> { fn request(method: &'static Method, path: String, redirect: bool, quarantine: bool, base_path: &'static str, host: &'static str) -> Boxed<Result<Response<Body>, String>> {
// Build Reddit URL from path. // Build Reddit URL from path.
let url = format!("{REDDIT_URL_BASE}{path}"); let url = format!("{base_path}{path}");
// Construct the hyper client from the HTTPS connector. // Construct the hyper client from the HTTPS connector.
let client: Client<_, Body> = CLIENT.clone(); let client: &Lazy<Client<_, Body>> = &CLIENT;
let (token, vendor_id, device_id, user_agent, loid) = { let (token, vendor_id, device_id, user_agent, loid) = {
let client = block_on(OAUTH_CLIENT.read()); let client = OAUTH_CLIENT.load_full();
( (
client.token.clone(), client.token.clone(),
client.headers_map.get("Client-Vendor-Id").cloned().unwrap_or_default(), client.headers_map.get("Client-Vendor-Id").cloned().unwrap_or_default(),
@ -194,7 +238,7 @@ fn request(method: &'static Method, path: String, redirect: bool, quarantine: bo
.header("Client-Vendor-Id", vendor_id) .header("Client-Vendor-Id", vendor_id)
.header("X-Reddit-Device-Id", device_id) .header("X-Reddit-Device-Id", device_id)
.header("x-reddit-loid", loid) .header("x-reddit-loid", loid)
.header("Host", "oauth.reddit.com") .header("Host", host)
.header("Authorization", &format!("Bearer {token}")) .header("Authorization", &format!("Bearer {token}"))
.header("Accept-Encoding", if method == Method::GET { "gzip" } else { "identity" }) .header("Accept-Encoding", if method == Method::GET { "gzip" } else { "identity" })
.header("Accept-Language", "en-US,en;q=0.5") .header("Accept-Language", "en-US,en;q=0.5")
@ -219,12 +263,13 @@ fn request(method: &'static Method, path: String, redirect: bool, quarantine: bo
if !redirect { if !redirect {
return Ok(response); return Ok(response);
}; };
let location_header = response.headers().get(header::LOCATION);
if location_header == Some(&HeaderValue::from_static("https://www.reddit.com/")) {
return Err("Reddit response was invalid".to_string());
}
return request( return request(
method, method,
response location_header
.headers()
.get(header::LOCATION)
.map(|val| { .map(|val| {
// We need to make adjustments to the URI // We need to make adjustments to the URI
// we get back from Reddit. Namely, we // we get back from Reddit. Namely, we
@ -237,13 +282,19 @@ fn request(method: &'static Method, path: String, redirect: bool, quarantine: bo
// required. // required.
// //
// 2. Percent-encode the path. // 2. Percent-encode the path.
let new_path = percent_encode(val.as_bytes(), CONTROLS).to_string().trim_start_matches(REDDIT_URL_BASE).to_string(); let new_path = percent_encode(val.as_bytes(), CONTROLS)
.to_string()
.trim_start_matches(REDDIT_URL_BASE)
.trim_start_matches(ALTERNATIVE_REDDIT_URL_BASE)
.to_string();
format!("{new_path}{}raw_json=1", if new_path.contains('?') { "&" } else { "?" }) format!("{new_path}{}raw_json=1", if new_path.contains('?') { "&" } else { "?" })
}) })
.unwrap_or_default() .unwrap_or_default()
.to_string(), .to_string(),
true, true,
quarantine, quarantine,
base_path,
host,
) )
.await; .await;
}; };
@ -296,7 +347,7 @@ fn request(method: &'static Method, path: String, redirect: bool, quarantine: bo
} }
} }
Err(e) => { Err(e) => {
dbg_msg!("{} {}: {}", method, path, e); dbg_msg!("{method} {REDDIT_URL_BASE}{path}: {}", e);
Err(e.to_string()) Err(e.to_string())
} }
@ -318,36 +369,28 @@ pub async fn json(path: String, quarantine: bool) -> Result<Value, String> {
// First, handle rolling over the OAUTH_CLIENT if need be. // First, handle rolling over the OAUTH_CLIENT if need be.
let current_rate_limit = OAUTH_RATELIMIT_REMAINING.load(Ordering::SeqCst); let current_rate_limit = OAUTH_RATELIMIT_REMAINING.load(Ordering::SeqCst);
if current_rate_limit < 10 { let is_rolling_over = OAUTH_IS_ROLLING_OVER.load(Ordering::SeqCst);
if current_rate_limit < 10 && !is_rolling_over {
warn!("Rate limit {current_rate_limit} is low. Spawning force_refresh_token()"); warn!("Rate limit {current_rate_limit} is low. Spawning force_refresh_token()");
OAUTH_RATELIMIT_REMAINING.store(99, Ordering::SeqCst);
tokio::spawn(force_refresh_token()); tokio::spawn(force_refresh_token());
} }
OAUTH_RATELIMIT_REMAINING.fetch_sub(1, Ordering::SeqCst);
// Fetch the url... // Fetch the url...
match reddit_get(path.clone(), quarantine).await { match reddit_get(path.clone(), quarantine).await {
Ok(response) => { Ok(response) => {
let status = response.status(); let status = response.status();
// Ratelimit remaining let reset: Option<String> = if let (Some(remaining), Some(reset), Some(used)) = (
if let Some(Ok(remaining)) = response.headers().get("x-ratelimit-remaining").map(|val| val.to_str()) { response.headers().get("x-ratelimit-remaining").and_then(|val| val.to_str().ok().map(|s| s.to_string())),
trace!("Ratelimit remaining: {}", remaining); response.headers().get("x-ratelimit-reset").and_then(|val| val.to_str().ok().map(|s| s.to_string())),
if let Ok(remaining) = remaining.parse::<f32>().map(|f| f.round() as u16) { response.headers().get("x-ratelimit-used").and_then(|val| val.to_str().ok().map(|s| s.to_string())),
OAUTH_RATELIMIT_REMAINING.store(remaining, SeqCst); ) {
} else { trace!(
warn!("Failed to parse rate limit {remaining} from header."); "Ratelimit remaining: Header says {remaining}, we have {current_rate_limit}. Resets in {reset}. Rollover: {}. Ratelimit used: {used}",
} if is_rolling_over { "yes" } else { "no" },
} );
Some(reset)
// Ratelimit used
if let Some(Ok(used)) = response.headers().get("x-ratelimit-used").map(|val| val.to_str()) {
trace!("Ratelimit used: {}", used);
}
// Ratelimit reset
let reset = if let Some(Ok(reset)) = response.headers().get("x-ratelimit-reset").map(|val| val.to_str()) {
trace!("Ratelimit reset: {}", reset);
Some(reset.to_string())
} else { } else {
None None
}; };
@ -358,8 +401,13 @@ pub async fn json(path: String, quarantine: bool) -> Result<Value, String> {
let has_remaining = body.has_remaining(); let has_remaining = body.has_remaining();
if !has_remaining { if !has_remaining {
// Rate limited, so spawn a force_refresh_token()
tokio::spawn(force_refresh_token());
return match reset { return match reset {
Some(val) => Err(format!("Reddit rate limit exceeded. Will reset in: {val}")), Some(val) => Err(format!(
"Reddit rate limit exceeded. Try refreshing in a few seconds.\
Rate limit will reset in: {val}"
)),
None => Err("Reddit rate limit exceeded".to_string()), None => Err("Reddit rate limit exceeded".to_string()),
}; };
} }
@ -368,6 +416,16 @@ pub async fn json(path: String, quarantine: bool) -> Result<Value, String> {
match serde_json::from_reader(body.reader()) { match serde_json::from_reader(body.reader()) {
Ok(value) => { Ok(value) => {
let json: Value = value; let json: Value = value;
// If user is suspended
if let Some(data) = json.get("data") {
if let Some(is_suspended) = data.get("is_suspended").and_then(Value::as_bool) {
if is_suspended {
return Err("suspended".into());
}
}
}
// If Reddit returned an error // If Reddit returned an error
if json["error"].is_i64() { if json["error"].is_i64() {
// OAuth token has expired; http status 401 // OAuth token has expired; http status 401
@ -376,6 +434,24 @@ pub async fn json(path: String, quarantine: bool) -> Result<Value, String> {
let () = force_refresh_token().await; let () = force_refresh_token().await;
return Err("OAuth token has expired. Please refresh the page!".to_string()); return Err("OAuth token has expired. Please refresh the page!".to_string());
} }
// Handle quarantined
if json["reason"] == "quarantined" {
return Err("quarantined".into());
}
// Handle gated
if json["reason"] == "gated" {
return Err("gated".into());
}
// Handle private subs
if json["reason"] == "private" {
return Err("private".into());
}
// Handle banned subs
if json["reason"] == "banned" {
return Err("banned".into());
}
Err(format!("Reddit error {} \"{}\": {} | {path}", json["error"], json["reason"], json["message"])) Err(format!("Reddit error {} \"{}\": {} | {path}", json["error"], json["reason"], json["message"]))
} else { } else {
Ok(json) Ok(json)
@ -411,13 +487,34 @@ async fn test_localization_popular() {
async fn test_obfuscated_share_link() { async fn test_obfuscated_share_link() {
let share_link = "/r/rust/s/kPgq8WNHRK".into(); let share_link = "/r/rust/s/kPgq8WNHRK".into();
// Correct link without share parameters // Correct link without share parameters
let canonical_link = "/r/rust/comments/18t5968/why_use_tuple_struct_over_standard_struct/kfbqlbc".into(); let canonical_link = "/r/rust/comments/18t5968/why_use_tuple_struct_over_standard_struct/kfbqlbc/".into();
assert_eq!(canonical_path(share_link).await, Ok(Some(canonical_link))); assert_eq!(canonical_path(share_link, 3).await, Ok(Some(canonical_link)));
} }
#[tokio::test(flavor = "multi_thread")] #[tokio::test(flavor = "multi_thread")]
async fn test_share_link_strip_json() { async fn test_share_link_strip_json() {
let link = "/17krzvz".into(); let link = "/17krzvz".into();
let canonical_link = "/r/nfl/comments/17krzvz/rapoport_sources_former_no_2_overall_pick/".into(); let canonical_link = "/comments/17krzvz".into();
assert_eq!(canonical_path(link).await, Ok(Some(canonical_link))); assert_eq!(canonical_path(link, 3).await, Ok(Some(canonical_link)));
}
#[tokio::test(flavor = "multi_thread")]
async fn test_private_sub() {
let link = json("/r/suicide/about.json?raw_json=1".into(), true).await;
assert!(link.is_err());
assert_eq!(link, Err("private".into()));
}
#[tokio::test(flavor = "multi_thread")]
async fn test_banned_sub() {
let link = json("/r/aaa/about.json?raw_json=1".into(), true).await;
assert!(link.is_err());
assert_eq!(link, Err("banned".into()));
}
#[tokio::test(flavor = "multi_thread")]
async fn test_gated_sub() {
// quarantine to false to specifically catch when we _don't_ catch it
let link = json("/r/drugs/about.json?raw_json=1".into(), false).await;
assert!(link.is_err());
assert_eq!(link, Err("gated".into()));
} }

View File

@ -111,6 +111,12 @@ pub struct Config {
#[serde(rename = "REDLIB_PUSHSHIFT_FRONTEND")] #[serde(rename = "REDLIB_PUSHSHIFT_FRONTEND")]
#[serde(alias = "LIBREDDIT_PUSHSHIFT_FRONTEND")] #[serde(alias = "LIBREDDIT_PUSHSHIFT_FRONTEND")]
pub(crate) pushshift: Option<String>, pub(crate) pushshift: Option<String>,
#[serde(rename = "REDLIB_ENABLE_RSS")]
pub(crate) enable_rss: Option<String>,
#[serde(rename = "REDLIB_FULL_URL")]
pub(crate) full_url: Option<String>,
} }
impl Config { impl Config {
@ -158,6 +164,8 @@ impl Config {
banner: parse("REDLIB_BANNER"), banner: parse("REDLIB_BANNER"),
robots_disable_indexing: parse("REDLIB_ROBOTS_DISABLE_INDEXING"), robots_disable_indexing: parse("REDLIB_ROBOTS_DISABLE_INDEXING"),
pushshift: parse("REDLIB_PUSHSHIFT_FRONTEND"), pushshift: parse("REDLIB_PUSHSHIFT_FRONTEND"),
enable_rss: parse("REDLIB_ENABLE_RSS"),
full_url: parse("REDLIB_FULL_URL"),
} }
} }
} }
@ -187,6 +195,8 @@ fn get_setting_from_config(name: &str, config: &Config) -> Option<String> {
"REDLIB_BANNER" => config.banner.clone(), "REDLIB_BANNER" => config.banner.clone(),
"REDLIB_ROBOTS_DISABLE_INDEXING" => config.robots_disable_indexing.clone(), "REDLIB_ROBOTS_DISABLE_INDEXING" => config.robots_disable_indexing.clone(),
"REDLIB_PUSHSHIFT_FRONTEND" => config.pushshift.clone(), "REDLIB_PUSHSHIFT_FRONTEND" => config.pushshift.clone(),
"REDLIB_ENABLE_RSS" => config.enable_rss.clone(),
"REDLIB_FULL_URL" => config.full_url.clone(),
_ => None, _ => None,
} }
} }

View File

@ -5,8 +5,8 @@ use crate::server::RequestExt;
use crate::subreddit::{can_access_quarantine, quarantine}; use crate::subreddit::{can_access_quarantine, quarantine};
use crate::utils::{error, filter_posts, get_filters, nsfw_landing, parse_post, template, Post, Preferences}; use crate::utils::{error, filter_posts, get_filters, nsfw_landing, parse_post, template, Post, Preferences};
use askama::Template;
use hyper::{Body, Request, Response}; use hyper::{Body, Request, Response};
use rinja::Template;
use serde_json::Value; use serde_json::Value;
use std::borrow::ToOwned; use std::borrow::ToOwned;
use std::collections::HashSet; use std::collections::HashSet;

View File

@ -3,10 +3,10 @@ use crate::{
server::RequestExt, server::RequestExt,
utils::{ErrorTemplate, Preferences}, utils::{ErrorTemplate, Preferences},
}; };
use askama::Template;
use build_html::{Container, Html, HtmlContainer, Table}; use build_html::{Container, Html, HtmlContainer, Table};
use hyper::{http::Error, Body, Request, Response}; use hyper::{http::Error, Body, Request, Response};
use once_cell::sync::Lazy; use once_cell::sync::Lazy;
use rinja::Template;
use serde::{Deserialize, Serialize}; use serde::{Deserialize, Serialize};
use time::OffsetDateTime; use time::OffsetDateTime;
@ -85,7 +85,7 @@ fn info_html(req: &Request<Body>) -> Result<Response<Body>, Error> {
pub struct InstanceInfo { pub struct InstanceInfo {
package_name: String, package_name: String,
crate_version: String, crate_version: String,
git_commit: String, pub git_commit: String,
deploy_date: String, deploy_date: String,
compile_mode: String, compile_mode: String,
deploy_unix_ts: i64, deploy_unix_ts: i64,
@ -126,6 +126,8 @@ impl InstanceInfo {
["Compile mode", &self.compile_mode], ["Compile mode", &self.compile_mode],
["SFW only", &convert(&self.config.sfw_only)], ["SFW only", &convert(&self.config.sfw_only)],
["Pushshift frontend", &convert(&self.config.pushshift)], ["Pushshift frontend", &convert(&self.config.pushshift)],
["RSS enabled", &convert(&self.config.enable_rss)],
["Full URL", &convert(&self.config.full_url)],
//TODO: fallback to crate::config::DEFAULT_PUSHSHIFT_FRONTEND //TODO: fallback to crate::config::DEFAULT_PUSHSHIFT_FRONTEND
]) ])
.with_header_row(["Settings"]), .with_header_row(["Settings"]),
@ -167,6 +169,8 @@ impl InstanceInfo {
Compile mode: {}\n Compile mode: {}\n
SFW only: {:?}\n SFW only: {:?}\n
Pushshift frontend: {:?}\n Pushshift frontend: {:?}\n
RSS enabled: {:?}\n
Full URL: {:?}\n
Config:\n Config:\n
Banner: {:?}\n Banner: {:?}\n
Hide awards: {:?}\n Hide awards: {:?}\n
@ -193,6 +197,8 @@ impl InstanceInfo {
self.deploy_unix_ts, self.deploy_unix_ts,
self.compile_mode, self.compile_mode,
self.config.sfw_only, self.config.sfw_only,
self.config.enable_rss,
self.config.full_url,
self.config.pushshift, self.config.pushshift,
self.config.banner, self.config.banner,
self.config.default_hide_awards, self.config.default_hide_awards,

13
src/lib.rs Normal file
View File

@ -0,0 +1,13 @@
pub mod client;
pub mod config;
pub mod duplicates;
pub mod instance_info;
pub mod oauth;
pub mod oauth_resources;
pub mod post;
pub mod search;
pub mod server;
pub mod settings;
pub mod subreddit;
pub mod user;
pub mod utils;

View File

@ -2,35 +2,21 @@
#![forbid(unsafe_code)] #![forbid(unsafe_code)]
#![allow(clippy::cmp_owned)] #![allow(clippy::cmp_owned)]
// Reference local files use cached::proc_macro::cached;
mod config;
mod duplicates;
mod instance_info;
mod oauth;
mod oauth_resources;
mod post;
mod search;
mod settings;
mod subreddit;
mod user;
mod utils;
// Import Crates
use clap::{Arg, ArgAction, Command}; use clap::{Arg, ArgAction, Command};
use std::str::FromStr;
use futures_lite::FutureExt; use futures_lite::FutureExt;
use hyper::Uri;
use hyper::{header::HeaderValue, Body, Request, Response}; use hyper::{header::HeaderValue, Body, Request, Response};
mod client;
use client::{canonical_path, proxy};
use log::info; use log::info;
use once_cell::sync::Lazy; use once_cell::sync::Lazy;
use server::RequestExt; use redsunlib::client::{canonical_path, proxy, CLIENT};
use utils::{error, redirect, ThemeAssets, MascotAssets}; use redsunlib::server::{self, RequestExt};
use redsunlib::utils::{error, redirect, ThemeAssets, MascotAssets};
use redsunlib::{config, duplicates, headers, instance_info, post, search, settings, subreddit, user};
use crate::client::OAUTH_CLIENT; use redsunlib::client::OAUTH_CLIENT;
mod server;
// Create Services // Create Services
@ -257,6 +243,12 @@ async fn main() {
app app
.at("/highlighted.js") .at("/highlighted.js")
.get(|_| resource(include_str!("../static/highlighted.js"), "text/javascript", false).boxed()); .get(|_| resource(include_str!("../static/highlighted.js"), "text/javascript", false).boxed());
app
.at("/check_update.js")
.get(|_| resource(include_str!("../static/check_update.js"), "text/javascript", false).boxed());
app.at("/commits.json").get(|_| async move { proxy_commit_info().await }.boxed());
// FFmpeg // FFmpeg
app app
.at("/ffmpeg/814.ffmpeg.js") .at("/ffmpeg/814.ffmpeg.js")
@ -284,6 +276,9 @@ async fn main() {
app.at("/img/*path").get(|r| proxy(r, "https://i.redd.it/{path}").boxed()); app.at("/img/*path").get(|r| proxy(r, "https://i.redd.it/{path}").boxed());
app.at("/thumb/:point/:id").get(|r| proxy(r, "https://{point}.thumbs.redditmedia.com/{id}").boxed()); app.at("/thumb/:point/:id").get(|r| proxy(r, "https://{point}.thumbs.redditmedia.com/{id}").boxed());
app.at("/emoji/:id/:name").get(|r| proxy(r, "https://emoji.redditmedia.com/{id}/{name}").boxed()); app.at("/emoji/:id/:name").get(|r| proxy(r, "https://emoji.redditmedia.com/{id}/{name}").boxed());
app
.at("/emote/:subreddit_id/:filename")
.get(|r| proxy(r, "https://reddit-econ-prod-assets-permanent.s3.amazonaws.com/asset-manager/{subreddit_id}/{filename}").boxed());
app app
.at("/preview/:loc/award_images/:fullname/:id") .at("/preview/:loc/award_images/:fullname/:id")
.get(|r| proxy(r, "https://{loc}view.redd.it/award_images/{fullname}/{id}").boxed()); .get(|r| proxy(r, "https://{loc}view.redd.it/award_images/{fullname}/{id}").boxed());
@ -299,6 +294,7 @@ async fn main() {
app.at("/u/:name/comments/:id/:title/:comment_id").get(|r| post::item(r).boxed()); app.at("/u/:name/comments/:id/:title/:comment_id").get(|r| post::item(r).boxed());
app.at("/user/[deleted]").get(|req| error(req, "User has deleted their account").boxed()); app.at("/user/[deleted]").get(|req| error(req, "User has deleted their account").boxed());
app.at("/user/:name.rss").get(|r| user::rss(r).boxed());
app.at("/user/:name").get(|r| user::profile(r).boxed()); app.at("/user/:name").get(|r| user::profile(r).boxed());
app.at("/user/:name/:listing").get(|r| user::profile(r).boxed()); app.at("/user/:name/:listing").get(|r| user::profile(r).boxed());
app.at("/user/:name/comments/:id").get(|r| post::item(r).boxed()); app.at("/user/:name/comments/:id").get(|r| post::item(r).boxed());
@ -313,6 +309,9 @@ async fn main() {
// Mascots // Mascots
app.at("/mascot/:name").get(|r| mascot_image(r).boxed()); app.at("/mascot/:name").get(|r| mascot_image(r).boxed());
// RSS Subscriptions
app.at("/r/:sub.rss").get(|r| subreddit::rss(r).boxed());
// Subreddit services // Subreddit services
app app
.at("/r/:sub") .at("/r/:sub")
@ -385,7 +384,7 @@ async fn main() {
let sub = req.param("sub").unwrap_or_default(); let sub = req.param("sub").unwrap_or_default();
match req.param("id").as_deref() { match req.param("id").as_deref() {
// Share link // Share link
Some(id) if (8..12).contains(&id.len()) => match canonical_path(format!("/r/{sub}/s/{id}")).await { Some(id) if (8..12).contains(&id.len()) => match canonical_path(format!("/r/{sub}/s/{id}"), 3).await {
Ok(Some(path)) => Ok(redirect(&path)), Ok(Some(path)) => Ok(redirect(&path)),
Ok(None) => error(req, "Post ID is invalid. It may point to a post on a community that has been banned.").await, Ok(None) => error(req, "Post ID is invalid. It may point to a post on a community that has been banned.").await,
Err(e) => error(req, &e).await, Err(e) => error(req, &e).await,
@ -404,7 +403,7 @@ async fn main() {
Some("best" | "hot" | "new" | "top" | "rising" | "controversial") => subreddit::community(req).await, Some("best" | "hot" | "new" | "top" | "rising" | "controversial") => subreddit::community(req).await,
// Short link for post // Short link for post
Some(id) if (5..8).contains(&id.len()) => match canonical_path(format!("/{id}")).await { Some(id) if (5..8).contains(&id.len()) => match canonical_path(format!("/{id}"), 3).await {
Ok(path_opt) => match path_opt { Ok(path_opt) => match path_opt {
Some(path) => Ok(redirect(&path)), Some(path) => Ok(redirect(&path)),
None => error(req, "Post ID is invalid. It may point to a post on a community that has been banned.").await, None => error(req, "Post ID is invalid. It may point to a post on a community that has been banned.").await,
@ -430,3 +429,22 @@ async fn main() {
eprintln!("Server error: {e}"); eprintln!("Server error: {e}");
} }
} }
pub async fn proxy_commit_info() -> Result<Response<Body>, String> {
Ok(
Response::builder()
.status(200)
.header("content-type", "application/atom+xml")
.body(Body::from(fetch_commit_info().await))
.unwrap_or_default(),
)
}
#[cached(time = 600)]
async fn fetch_commit_info() -> String {
let uri = Uri::from_str("https://git.stardust.wtf/api/v1/repos/iridium/redsunlib/commits?verification=false&stat=false").expect("Invalid URI");
let resp: Body = CLIENT.get(uri).await.expect("Failed to request git.stardust.wtf").into_body();
hyper::body::to_bytes(resp).await.expect("Failed to read body").iter().copied().map(|x| x as char).collect()
}
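The /commits.json route proxies fetch_commit_info, and the #[cached(time = 600)] attribute memoizes that call for ten minutes, so the Gitea API is queried at most once per TTL window. The sketch below shows the same pattern in isolation, assuming the cached crate with its async support enabled; the fetch body is a stand-in, not the real request.

use cached::proc_macro::cached;
use std::time::Duration;

// Stand-in for the upstream request; only the caching pattern matters here.
async fn fetch_remote() -> String {
    tokio::time::sleep(Duration::from_millis(50)).await;
    String::from("[{\"sha\":\"abc123\"}]")
}

// The result is memoized for 600 seconds, so repeated calls within the TTL
// return the cached string instead of hitting the network again.
#[cached(time = 600)]
async fn cached_commit_info() -> String {
    fetch_remote().await
}

#[tokio::main]
async fn main() {
    let first = cached_commit_info().await;  // performs the fetch
    let second = cached_commit_info().await; // served from the cache
    assert_eq!(first, second);
}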

src/oauth.rs

@ -1,18 +1,19 @@
use std::{collections::HashMap, sync::atomic::Ordering, time::Duration}; use std::{collections::HashMap, sync::atomic::Ordering, time::Duration};
use crate::{ use crate::{
client::{CLIENT, OAUTH_CLIENT, OAUTH_RATELIMIT_REMAINING}, client::{CLIENT, OAUTH_CLIENT, OAUTH_IS_ROLLING_OVER, OAUTH_RATELIMIT_REMAINING},
oauth_resources::ANDROID_APP_VERSION_LIST, oauth_resources::ANDROID_APP_VERSION_LIST,
}; };
use base64::{engine::general_purpose, Engine as _}; use base64::{engine::general_purpose, Engine as _};
use hyper::{client, Body, Method, Request}; use hyper::{client, Body, Method, Request};
use log::{info, trace}; use log::{error, info, trace};
use serde_json::json; use serde_json::json;
use tokio::time::{error::Elapsed, timeout};
static REDDIT_ANDROID_OAUTH_CLIENT_ID: &str = "ohXpoqrZYub1kg"; static REDDIT_ANDROID_OAUTH_CLIENT_ID: &str = "ohXpoqrZYub1kg";
static AUTH_ENDPOINT: &str = "https://accounts.reddit.com"; static AUTH_ENDPOINT: &str = "https://www.reddit.com";
// Spoofed client for Android devices // Spoofed client for Android devices
#[derive(Debug, Clone, Default)] #[derive(Debug, Clone, Default)]
@ -25,11 +26,32 @@ pub struct Oauth {
} }
impl Oauth { impl Oauth {
/// Create a new OAuth client
pub(crate) async fn new() -> Self { pub(crate) async fn new() -> Self {
let mut oauth = Self::default(); // Call new_internal until it succeeds
oauth.login().await; loop {
oauth let attempt = Self::new_with_timeout().await;
match attempt {
Ok(Some(oauth)) => {
info!("[✅] Successfully created OAuth client");
return oauth;
}
Ok(None) => {
error!("Failed to create OAuth client. Retrying in 5 seconds...");
continue;
}
Err(duration) => {
error!("Failed to create OAuth client in {duration:?}. Retrying in 5 seconds...");
}
}
}
} }
async fn new_with_timeout() -> Result<Option<Self>, Elapsed> {
let mut oauth = Self::default();
timeout(Duration::from_secs(5), oauth.login()).await.map(|result| result.map(|_| oauth))
}
pub(crate) fn default() -> Self { pub(crate) fn default() -> Self {
// Generate a device to spoof // Generate a device to spoof
let device = Device::new(); let device = Device::new();
@ -46,7 +68,7 @@ impl Oauth {
} }
async fn login(&mut self) -> Option<()> { async fn login(&mut self) -> Option<()> {
// Construct URL for OAuth token // Construct URL for OAuth token
let url = format!("{AUTH_ENDPOINT}/api/access_token"); let url = format!("{AUTH_ENDPOINT}/auth/v2/oauth/access-token/loid");
let mut builder = Request::builder().method(Method::POST).uri(&url); let mut builder = Request::builder().method(Method::POST).uri(&url);
// Add headers from spoofed client // Add headers from spoofed client
@ -69,13 +91,19 @@ impl Oauth {
// Build request // Build request
let request = builder.body(body).unwrap(); let request = builder.body(body).unwrap();
trace!("Sending token request...");
// Send request // Send request
let client: client::Client<_, Body> = CLIENT.clone(); let client: &once_cell::sync::Lazy<client::Client<_, Body>> = &CLIENT;
let resp = client.request(request).await.ok()?; let resp = client.request(request).await.ok()?;
trace!("Received response with status {} and length {:?}", resp.status(), resp.headers().get("content-length"));
// Parse headers - loid header _should_ be saved sent on subsequent token refreshes. // Parse headers - loid header _should_ be saved sent on subsequent token refreshes.
// Technically it's not needed, but it's easy for Reddit API to check for this. // Technically it's not needed, but it's easy for Reddit API to check for this.
// It's some kind of header that uniquely identifies the device. // It's some kind of header that uniquely identifies the device.
// Not worried about the privacy implications, since this is randomly changed
// and really only as privacy-concerning as the OAuth token itself.
if let Some(header) = resp.headers().get("x-reddit-loid") { if let Some(header) = resp.headers().get("x-reddit-loid") {
self.headers_map.insert("x-reddit-loid".to_owned(), header.to_str().ok()?.to_string()); self.headers_map.insert("x-reddit-loid".to_owned(), header.to_str().ok()?.to_string());
} }
@ -85,10 +113,14 @@ impl Oauth {
self.headers_map.insert("x-reddit-session".to_owned(), header.to_str().ok()?.to_string()); self.headers_map.insert("x-reddit-session".to_owned(), header.to_str().ok()?.to_string());
} }
trace!("Serializing response...");
// Serialize response // Serialize response
let body_bytes = hyper::body::to_bytes(resp.into_body()).await.ok()?; let body_bytes = hyper::body::to_bytes(resp.into_body()).await.ok()?;
let json: serde_json::Value = serde_json::from_slice(&body_bytes).ok()?; let json: serde_json::Value = serde_json::from_slice(&body_bytes).ok()?;
trace!("Accessing relevant fields...");
// Save token and expiry // Save token and expiry
self.token = json.get("access_token")?.as_str()?.to_string(); self.token = json.get("access_token")?.as_str()?.to_string();
self.expires_in = json.get("expires_in")?.as_u64()?; self.expires_in = json.get("expires_in")?.as_u64()?;
@ -98,21 +130,13 @@ impl Oauth {
Some(()) Some(())
} }
async fn refresh(&mut self) -> Option<()> {
// Refresh is actually just a subsequent login with the same headers (without the old token
// or anything). This logic is handled in login, so we just call login again.
let refresh = self.login().await;
info!("Refreshing OAuth token... {}", if refresh.is_some() { "success" } else { "failed" });
refresh
}
} }
pub async fn token_daemon() { pub async fn token_daemon() {
// Monitor for refreshing token // Monitor for refreshing token
loop { loop {
// Get expiry time - be sure to not hold the read lock // Get expiry time - be sure to not hold the read lock
let expires_in = { OAUTH_CLIENT.read().await.expires_in }; let expires_in = { OAUTH_CLIENT.load_full().expires_in };
// sleep for the expiry time minus 2 minutes // sleep for the expiry time minus 2 minutes
let duration = Duration::from_secs(expires_in - 120); let duration = Duration::from_secs(expires_in - 120);
@ -125,14 +149,22 @@ pub async fn token_daemon() {
// Refresh token - in its own scope // Refresh token - in its own scope
{ {
OAUTH_CLIENT.write().await.refresh().await; force_refresh_token().await;
} }
} }
} }
pub async fn force_refresh_token() { pub async fn force_refresh_token() {
if OAUTH_IS_ROLLING_OVER.compare_exchange(false, true, Ordering::SeqCst, Ordering::SeqCst).is_err() {
trace!("Skipping refresh token roll over, already in progress");
return;
}
trace!("Rolling over refresh token. Current rate limit: {}", OAUTH_RATELIMIT_REMAINING.load(Ordering::SeqCst)); trace!("Rolling over refresh token. Current rate limit: {}", OAUTH_RATELIMIT_REMAINING.load(Ordering::SeqCst));
OAUTH_CLIENT.write().await.refresh().await; let new_client = Oauth::new().await;
OAUTH_CLIENT.swap(new_client.into());
OAUTH_RATELIMIT_REMAINING.store(99, Ordering::SeqCst);
OAUTH_IS_ROLLING_OVER.store(false, Ordering::SeqCst);
} }
#[derive(Debug, Clone, Default)] #[derive(Debug, Clone, Default)]
@ -180,21 +212,21 @@ fn choose<T: Copy>(list: &[T]) -> T {
#[tokio::test(flavor = "multi_thread")] #[tokio::test(flavor = "multi_thread")]
async fn test_oauth_client() { async fn test_oauth_client() {
assert!(!OAUTH_CLIENT.read().await.token.is_empty()); assert!(!OAUTH_CLIENT.load_full().token.is_empty());
} }
#[tokio::test(flavor = "multi_thread")] #[tokio::test(flavor = "multi_thread")]
async fn test_oauth_client_refresh() { async fn test_oauth_client_refresh() {
OAUTH_CLIENT.write().await.refresh().await.unwrap(); force_refresh_token().await;
} }
#[tokio::test(flavor = "multi_thread")] #[tokio::test(flavor = "multi_thread")]
async fn test_oauth_token_exists() { async fn test_oauth_token_exists() {
assert!(!OAUTH_CLIENT.read().await.token.is_empty()); assert!(!OAUTH_CLIENT.load_full().token.is_empty());
} }
#[tokio::test(flavor = "multi_thread")] #[tokio::test(flavor = "multi_thread")]
async fn test_oauth_headers_len() { async fn test_oauth_headers_len() {
assert!(OAUTH_CLIENT.read().await.headers_map.len() >= 3); assert!(OAUTH_CLIENT.load_full().headers_map.len() >= 3);
} }
#[test] #[test]

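The refresh path no longer goes through an async RwLock: load_full() and swap() match the arc_swap API (the actual OAUTH_CLIENT definition lives in client.rs, outside this diff, so the exact type is an assumption here), and OAUTH_IS_ROLLING_OVER is an atomic flag that stops concurrent rollovers from stacking. A compact sketch of that pattern, under those assumptions:

use std::sync::atomic::{AtomicBool, Ordering};
use arc_swap::ArcSwap;
use once_cell::sync::Lazy;

#[derive(Default)]
struct Client { token: String } // illustrative stand-in for the real OAuth client

static CLIENT: Lazy<ArcSwap<Client>> = Lazy::new(|| ArcSwap::from_pointee(Client::default()));
static ROLLING_OVER: AtomicBool = AtomicBool::new(false);

fn force_refresh(new_client: Client) {
    // Only one caller may roll the token over at a time; the rest bail out early.
    if ROLLING_OVER.compare_exchange(false, true, Ordering::SeqCst, Ordering::SeqCst).is_err() {
        return;
    }
    CLIENT.swap(new_client.into()); // readers calling load_full() never block
    ROLLING_OVER.store(false, Ordering::SeqCst);
}

fn main() {
    force_refresh(Client { token: "example-token".into() });
    assert_eq!(CLIENT.load_full().token, "example-token");
}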
src/post.rs

@ -1,17 +1,19 @@
#![allow(clippy::cmp_owned)]
// CRATES // CRATES
use crate::client::json; use crate::client::json;
use crate::config::get_setting; use crate::config::get_setting;
use crate::server::RequestExt; use crate::server::RequestExt;
use crate::subreddit::{can_access_quarantine, quarantine}; use crate::subreddit::{can_access_quarantine, quarantine};
use crate::utils::{ use crate::utils::{
error, format_num, get_filters, nsfw_landing, param, parse_post, rewrite_urls, setting, template, time, val, Author, Awards, Comment, Flair, FlairPart, Post, Preferences, error, format_num, get_filters, nsfw_landing, param, parse_post, rewrite_emotes, setting, template, time, val, Author, Awards, Comment, Flair, FlairPart, Post, Preferences,
}; };
use hyper::{Body, Request, Response}; use hyper::{Body, Request, Response};
use askama::Template;
use once_cell::sync::Lazy; use once_cell::sync::Lazy;
use regex::Regex; use regex::Regex;
use std::collections::HashSet; use rinja::Template;
use std::collections::{HashMap, HashSet};
// STRUCTS // STRUCTS
#[derive(Template)] #[derive(Template)]
@ -72,11 +74,15 @@ pub async fn item(req: Request<Body>) -> Result<Response<Body>, String> {
return Ok(nsfw_landing(req, req_url).await.unwrap_or_default()); return Ok(nsfw_landing(req, req_url).await.unwrap_or_default());
} }
let query = match COMMENT_SEARCH_CAPTURE.captures(&url) { let query_body = match COMMENT_SEARCH_CAPTURE.captures(&url) {
Some(captures) => captures.get(1).unwrap().as_str().replace("%20", " ").replace('+', " "), Some(captures) => captures.get(1).unwrap().as_str().replace("%20", " ").replace('+', " "),
None => String::new(), None => String::new(),
}; };
let query_string = format!("q={query_body}&type=comment");
let form = url::form_urlencoded::parse(query_string.as_bytes()).collect::<HashMap<_, _>>();
let query = form.get("q").unwrap().clone().to_string();
let comments = match query.as_str() { let comments = match query.as_str() {
"" => parse_comments(&response[1], &post.permalink, &post.author.name, highlighted_comment, &get_filters(&req), &req), "" => parse_comments(&response[1], &post.permalink, &post.author.name, highlighted_comment, &get_filters(&req), &req),
_ => query_comments(&response[1], &post.permalink, &post.author.name, highlighted_comment, &get_filters(&req), &query, &req), _ => query_comments(&response[1], &post.permalink, &post.author.name, highlighted_comment, &get_filters(&req), &query, &req),
@ -174,7 +180,7 @@ fn build_comment(
get_setting("REDLIB_PUSHSHIFT_FRONTEND").unwrap_or_else(|| String::from(crate::config::DEFAULT_PUSHSHIFT_FRONTEND)), get_setting("REDLIB_PUSHSHIFT_FRONTEND").unwrap_or_else(|| String::from(crate::config::DEFAULT_PUSHSHIFT_FRONTEND)),
) )
} else { } else {
rewrite_urls(&val(comment, "body_html")) rewrite_emotes(&data["media_metadata"], val(comment, "body_html"))
}; };
let kind = comment["kind"].as_str().unwrap_or_default().to_string(); let kind = comment["kind"].as_str().unwrap_or_default().to_string();

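Routing the comment-search query back through url::form_urlencoded means '+' and percent escapes are decoded the same way they would be in any query string, rather than by ad-hoc replace calls. A minimal sketch of that decoding step (the query value here is made up):

use std::collections::HashMap;

fn main() {
    // Build a q=...&type=comment string, then let form_urlencoded decode it.
    let query_string = "q=hello+world%21&type=comment";
    let form = url::form_urlencoded::parse(query_string.as_bytes()).collect::<HashMap<_, _>>();
    let query = form.get("q").unwrap().clone().to_string();
    assert_eq!(query, "hello world!");
}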
src/scraper/main.rs (new file, 75 lines)

@ -0,0 +1,75 @@
use std::{fmt::Display, io::Write};
use clap::{Parser, ValueEnum};
use redsunlib::utils::Post;
#[derive(Parser)]
#[command(name = "my_cli")]
#[command(about = "A simple CLI example", long_about = None)]
struct Cli {
#[arg(short = 's', long = "sub")]
sub: String,
#[arg(short = 'c', long = "count")]
count: usize,
#[arg(long = "sort")]
sort: SortOrder,
#[arg(short = 'f', long = "format", value_enum)]
format: Format,
#[arg(short = 'o', long = "output")]
output: Option<String>,
}
#[derive(Debug, Clone, ValueEnum)]
enum SortOrder {
Hot,
Rising,
New,
Top,
Controversial,
}
impl Display for SortOrder {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
SortOrder::Hot => write!(f, "hot"),
SortOrder::Rising => write!(f, "rising"),
SortOrder::New => write!(f, "new"),
SortOrder::Top => write!(f, "top"),
SortOrder::Controversial => write!(f, "controversial"),
}
}
}
#[derive(Debug, Clone, ValueEnum)]
enum Format {
Json,
}
#[tokio::main]
async fn main() {
let cli = Cli::parse();
let (sub, final_count, sort, format, output) = (cli.sub, cli.count, cli.sort, cli.format, cli.output);
let initial = format!("/r/{sub}/{sort}.json?&raw_json=1");
let (mut posts, mut after) = Post::fetch(&initial, false).await.unwrap();
while posts.len() < final_count {
print!("\r");
let path = format!("/r/{sub}/{sort}.json?sort={sort}&t=&after={after}&raw_json=1");
let (new_posts, new_after) = Post::fetch(&path, false).await.unwrap();
posts.extend(new_posts);
after = new_after;
// Print number of posts fetched
print!("Fetched {} posts", posts.len());
std::io::stdout().flush().unwrap();
}
match format {
Format::Json => {
let filename: String = output.unwrap_or_else(|| format!("{sub}.json"));
let json = serde_json::to_string(&posts).unwrap();
std::fs::write(filename, json).unwrap();
}
}
}
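For reference, the scraper pages through the listing with Post::fetch until it has at least the requested number of posts, then serializes them to JSON. A run would pass flags along the lines of --sub rust --count 100 --sort hot --format json --output rust.json; the exact binary name comes from the [[bin]] entry in Cargo.toml, which is not part of this diff.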

src/search.rs

@ -1,14 +1,16 @@
#![allow(clippy::cmp_owned)]
// CRATES // CRATES
use crate::utils::{self, catch_random, error, filter_posts, format_num, format_url, get_filters, param, redirect, setting, template, val, Post, Preferences}; use crate::utils::{self, catch_random, error, filter_posts, format_num, format_url, get_filters, param, redirect, setting, template, val, Post, Preferences};
use crate::{ use crate::{
client::json, client::json,
server::RequestExt,
subreddit::{can_access_quarantine, quarantine}, subreddit::{can_access_quarantine, quarantine},
RequestExt,
}; };
use askama::Template;
use hyper::{Body, Request, Response}; use hyper::{Body, Request, Response};
use once_cell::sync::Lazy; use once_cell::sync::Lazy;
use regex::Regex; use regex::Regex;
use rinja::Template;
// STRUCTS // STRUCTS
struct SearchParams { struct SearchParams {
@ -60,7 +62,8 @@ pub async fn find(req: Request<Body>) -> Result<Response<Body>, String> {
} else { } else {
"" ""
}; };
let path = format!("{}.json?{}{}&raw_json=1", req.uri().path(), req.uri().query().unwrap_or_default(), nsfw_results); let uri_path = req.uri().path().replace("+", "%2B");
let path = format!("{}.json?{}{}&raw_json=1", uri_path, req.uri().query().unwrap_or_default(), nsfw_results);
let mut query = param(&path, "q").unwrap_or_default(); let mut query = param(&path, "q").unwrap_or_default();
query = REDDIT_URL_MATCH.replace(&query, "").to_string(); query = REDDIT_URL_MATCH.replace(&query, "").to_string();
@ -68,10 +71,14 @@ pub async fn find(req: Request<Body>) -> Result<Response<Body>, String> {
return Ok(redirect("/")); return Ok(redirect("/"));
} }
if query.starts_with("r/") { if query.starts_with("r/") || query.starts_with("user/") {
return Ok(redirect(&format!("/{query}"))); return Ok(redirect(&format!("/{query}")));
} }
if query.starts_with("u/") {
return Ok(redirect(&format!("/user{}", &query[1..])));
}
let sub = req.param("sub").unwrap_or_default(); let sub = req.param("sub").unwrap_or_default();
let quarantined = can_access_quarantine(&req, &sub); let quarantined = can_access_quarantine(&req, &sub);
// Handle random subreddits // Handle random subreddits

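Escaping '+' in the request path keeps multireddit searches (r/a+b) intact once the path is forwarded to the JSON API; the same hunk also adds a u/name shortcut that redirects to /user/name. The escaping itself is a one-liner, assuming nothing beyond std:

fn main() {
    // A literal '+' would otherwise be treated as an encoded space downstream.
    let uri_path = "/r/rust+programming/search";
    let escaped = uri_path.replace('+', "%2B");
    assert_eq!(escaped, "/r/rust%2Bprogramming/search");
}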
src/server.rs

@ -1,4 +1,5 @@
#![allow(dead_code)] #![allow(dead_code)]
#![allow(clippy::cmp_owned)]
use brotli::enc::{BrotliCompress, BrotliEncoderParams}; use brotli::enc::{BrotliCompress, BrotliEncoderParams};
use cached::proc_macro::cached; use cached::proc_macro::cached;
@ -195,6 +196,12 @@ impl Route<'_> {
} }
} }
impl Default for Server {
fn default() -> Self {
Self::new()
}
}
impl Server { impl Server {
pub fn new() -> Self { pub fn new() -> Self {
Self { Self {
@ -723,7 +730,7 @@ mod tests {
CompressionType::Brotli => Box::new(BrotliDecompressor::new(body_cursor, expected_lorem_ipsum.len())), CompressionType::Brotli => Box::new(BrotliDecompressor::new(body_cursor, expected_lorem_ipsum.len())),
_ => panic!("no decompressor for {}", expected_encoding.to_string()), _ => panic!("no decompressor for {}", expected_encoding),
}; };
let mut decompressed = Vec::<u8>::new(); let mut decompressed = Vec::<u8>::new();

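The new Default impl simply delegates to Server::new(), which is the usual answer to clippy's new_without_default lint. A sketch of the pattern with an illustrative field (the real Server fields are not shown in this hunk):

struct Server {
    routes: Vec<String>, // illustrative only; the real struct lives in src/server.rs
}

impl Server {
    fn new() -> Self {
        Self { routes: Vec::new() }
    }
}

// Clippy's new_without_default lint asks for this whenever `new()` takes no arguments.
impl Default for Server {
    fn default() -> Self {
        Self::new()
    }
}

fn main() {
    assert!(Server::default().routes.is_empty());
}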
src/settings.rs

@ -1,12 +1,14 @@
#![allow(clippy::cmp_owned)]
use std::collections::HashMap; use std::collections::HashMap;
// CRATES // CRATES
use crate::server::ResponseExt; use crate::server::ResponseExt;
use crate::utils::{redirect, template, Preferences}; use crate::utils::{redirect, template, Preferences};
use askama::Template;
use cookie::Cookie; use cookie::Cookie;
use futures_lite::StreamExt; use futures_lite::StreamExt;
use hyper::{Body, Request, Response}; use hyper::{Body, Request, Response};
use rinja::Template;
use time::{Duration, OffsetDateTime}; use time::{Duration, OffsetDateTime};
// STRUCTS // STRUCTS
@ -19,7 +21,7 @@ struct SettingsTemplate {
// CONSTANTS // CONSTANTS
const PREFS: [&str; 19] = [ const PREFS: [&str; 20] = [
"theme", "theme",
"mascot", "mascot",
"front_page", "front_page",
@ -39,6 +41,7 @@ const PREFS: [&str; 19] = [
"hide_awards", "hide_awards",
"hide_score", "hide_score",
"disable_visit_reddit_confirmation", "disable_visit_reddit_confirmation",
"video_quality",
]; ];
// FUNCTIONS // FUNCTIONS

src/subreddit.rs

@ -1,11 +1,14 @@
#![allow(clippy::cmp_owned)]
use crate::{config, utils};
// CRATES // CRATES
use crate::utils::{ use crate::utils::{
catch_random, error, filter_posts, format_num, format_url, get_filters, nsfw_landing, param, redirect, rewrite_urls, setting, template, val, Post, Preferences, Subreddit, catch_random, error, filter_posts, format_num, format_url, get_filters, nsfw_landing, param, redirect, rewrite_urls, setting, template, val, Post, Preferences, Subreddit,
}; };
use crate::{client::json, server::ResponseExt, RequestExt}; use crate::{client::json, server::RequestExt, server::ResponseExt};
use askama::Template;
use cookie::Cookie; use cookie::Cookie;
use hyper::{Body, Request, Response}; use hyper::{Body, Request, Response};
use rinja::Template;
use once_cell::sync::Lazy; use once_cell::sync::Lazy;
use regex::Regex; use regex::Regex;
@ -64,7 +67,7 @@ pub async fn community(req: Request<Body>) -> Result<Response<Body>, String> {
let post_sort = req.cookie("post_sort").map_or_else(|| "hot".to_string(), |c| c.value().to_string()); let post_sort = req.cookie("post_sort").map_or_else(|| "hot".to_string(), |c| c.value().to_string());
let sort = req.param("sort").unwrap_or_else(|| req.param("id").unwrap_or(post_sort)); let sort = req.param("sort").unwrap_or_else(|| req.param("id").unwrap_or(post_sort));
let mut sub_name = req.param("sub").unwrap_or(if front_page == "default" || front_page.is_empty() { let sub_name = req.param("sub").unwrap_or(if front_page == "default" || front_page.is_empty() {
if subscribed.is_empty() { if subscribed.is_empty() {
"popular".to_string() "popular".to_string()
} else { } else {
@ -84,11 +87,6 @@ pub async fn community(req: Request<Body>) -> Result<Response<Body>, String> {
return Ok(redirect(&["/user/", &sub_name[2..]].concat())); return Ok(redirect(&["/user/", &sub_name[2..]].concat()));
} }
// If multi-sub, replace + with url encoded +
if sub_name.contains('+') {
sub_name = sub_name.replace('+', "%2B");
}
// Request subreddit metadata // Request subreddit metadata
let sub = if !sub_name.contains('+') && sub_name != subscribed && sub_name != "popular" && sub_name != "all" { let sub = if !sub_name.contains('+') && sub_name != subscribed && sub_name != "popular" && sub_name != "all" {
// Regular subreddit // Regular subreddit
@ -124,7 +122,7 @@ pub async fn community(req: Request<Body>) -> Result<Response<Body>, String> {
params.push_str(&format!("&geo_filter={geo_filter}")); params.push_str(&format!("&geo_filter={geo_filter}"));
} }
let path = format!("/r/{sub_name}/{sort}.json?{}{params}", req.uri().query().unwrap_or_default()); let path = format!("/r/{}/{sort}.json?{}{params}", sub_name.replace('+', "%2B"), req.uri().query().unwrap_or_default());
let url = String::from(req.uri().path_and_query().map_or("", |val| val.as_str())); let url = String::from(req.uri().path_and_query().map_or("", |val| val.as_str()));
let redirect_url = url[1..].replace('?', "%3F").replace('&', "%26").replace('+', "%2B"); let redirect_url = url[1..].replace('?', "%3F").replace('&', "%26").replace('+', "%2B");
let filters = get_filters(&req); let filters = get_filters(&req);
@ -150,6 +148,10 @@ pub async fn community(req: Request<Body>) -> Result<Response<Body>, String> {
let (_, all_posts_filtered) = filter_posts(&mut posts, &filters); let (_, all_posts_filtered) = filter_posts(&mut posts, &filters);
let no_posts = posts.is_empty(); let no_posts = posts.is_empty();
let all_posts_hidden_nsfw = !no_posts && (posts.iter().all(|p| p.flags.nsfw) && setting(&req, "show_nsfw") != "on"); let all_posts_hidden_nsfw = !no_posts && (posts.iter().all(|p| p.flags.nsfw) && setting(&req, "show_nsfw") != "on");
if sort == "new" {
posts.sort_by(|a, b| b.created_ts.cmp(&a.created_ts));
posts.sort_by(|a, b| b.flags.stickied.cmp(&a.flags.stickied));
}
Ok(template(&SubredditTemplate { Ok(template(&SubredditTemplate {
sub, sub,
posts, posts,
@ -460,8 +462,71 @@ async fn subreddit(sub: &str, quarantined: bool) -> Result<Subreddit, String> {
}) })
} }
pub async fn rss(req: Request<Body>) -> Result<Response<Body>, String> {
if config::get_setting("REDLIB_ENABLE_RSS").is_none() {
return Ok(error(req, "RSS is disabled on this instance.").await.unwrap_or_default());
}
use hyper::header::CONTENT_TYPE;
use rss::{ChannelBuilder, Item};
// Get subreddit
let sub = req.param("sub").unwrap_or_default();
let post_sort = req.cookie("post_sort").map_or_else(|| "hot".to_string(), |c| c.value().to_string());
let sort = req.param("sort").unwrap_or_else(|| req.param("id").unwrap_or(post_sort));
// Get path
let path = format!("/r/{sub}/{sort}.json?{}", req.uri().query().unwrap_or_default());
// Get subreddit data
let subreddit = subreddit(&sub, false).await?;
// Get posts
let (posts, _) = Post::fetch(&path, false).await?;
// Build the RSS feed
let channel = ChannelBuilder::default()
.title(&subreddit.title)
.description(&subreddit.description)
.items(
posts
.into_iter()
.map(|post| Item {
title: Some(post.title.to_string()),
link: Some(utils::get_post_url(&post)),
author: Some(post.author.name),
content: Some(rewrite_urls(&post.body)),
description: Some(format!(
"<a href='{}{}'>Comments</a>",
config::get_setting("REDLIB_FULL_URL").unwrap_or_default(),
post.permalink
)),
..Default::default()
})
.collect::<Vec<_>>(),
)
.build();
// Serialize the feed to RSS
let body = channel.to_string().into_bytes();
// Create the HTTP response
let mut res = Response::new(Body::from(body));
res.headers_mut().insert(CONTENT_TYPE, hyper::header::HeaderValue::from_static("application/rss+xml"));
Ok(res)
}
#[tokio::test(flavor = "multi_thread")] #[tokio::test(flavor = "multi_thread")]
async fn test_fetching_subreddit() { async fn test_fetching_subreddit() {
let subreddit = subreddit("rust", false).await; let subreddit = subreddit("rust", false).await;
assert!(subreddit.is_ok()); assert!(subreddit.is_ok());
} }
#[tokio::test(flavor = "multi_thread")]
async fn test_gated_and_quarantined() {
let quarantined = subreddit("edgy", true).await;
assert!(quarantined.is_ok());
let gated = subreddit("drugs", true).await;
assert!(gated.is_ok());
}
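The feed handler builds its XML with the rss crate: posts are mapped into Item values, wrapped in a channel via ChannelBuilder, and the serialized channel is returned with an application/rss+xml content type. A trimmed-down sketch of that flow, with hard-coded example data:

use rss::{ChannelBuilder, Item};

fn main() {
    let items = vec![Item {
        title: Some("Example post".to_string()),
        link: Some("https://example.invalid/r/rust/comments/abc123/example_post".to_string()),
        ..Default::default()
    }];
    let channel = ChannelBuilder::default()
        .title("r/rust")
        .description("Example subreddit feed")
        .items(items)
        .build();
    // channel.to_string() produces the XML body that the handler sends back.
    assert!(channel.to_string().contains("<item>"));
}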

src/user.rs

@ -1,9 +1,12 @@
#![allow(clippy::cmp_owned)]
// CRATES // CRATES
use crate::client::json; use crate::client::json;
use crate::server::RequestExt; use crate::server::RequestExt;
use crate::utils::{error, filter_posts, format_url, get_filters, nsfw_landing, param, setting, template, Post, Preferences, User}; use crate::utils::{error, filter_posts, format_url, get_filters, nsfw_landing, param, setting, template, Post, Preferences, User};
use askama::Template; use crate::{config, utils};
use hyper::{Body, Request, Response}; use hyper::{Body, Request, Response};
use rinja::Template;
use time::{macros::format_description, OffsetDateTime}; use time::{macros::format_description, OffsetDateTime};
// STRUCTS // STRUCTS
@ -129,6 +132,56 @@ async fn user(name: &str) -> Result<User, String> {
}) })
} }
pub async fn rss(req: Request<Body>) -> Result<Response<Body>, String> {
if config::get_setting("REDLIB_ENABLE_RSS").is_none() {
return Ok(error(req, "RSS is disabled on this instance.").await.unwrap_or_default());
}
use crate::utils::rewrite_urls;
use hyper::header::CONTENT_TYPE;
use rss::{ChannelBuilder, Item};
// Get user
let user_str = req.param("name").unwrap_or_default();
let listing = req.param("listing").unwrap_or_else(|| "overview".to_string());
// Get path
let path = format!("/user/{user_str}/{listing}.json?{}&raw_json=1", req.uri().query().unwrap_or_default(),);
// Get user
let user_obj = user(&user_str).await.unwrap_or_default();
// Get posts
let (posts, _) = Post::fetch(&path, false).await?;
// Build the RSS feed
let channel = ChannelBuilder::default()
.title(user_str)
.description(user_obj.description)
.items(
posts
.into_iter()
.map(|post| Item {
title: Some(post.title.to_string()),
link: Some(utils::get_post_url(&post)),
author: Some(post.author.name),
content: Some(rewrite_urls(&post.body)),
..Default::default()
})
.collect::<Vec<_>>(),
)
.build();
// Serialize the feed to RSS
let body = channel.to_string().into_bytes();
// Create the HTTP response
let mut res = Response::new(Body::from(body));
res.headers_mut().insert(CONTENT_TYPE, hyper::header::HeaderValue::from_static("application/rss+xml"));
Ok(res)
}
#[tokio::test(flavor = "multi_thread")] #[tokio::test(flavor = "multi_thread")]
async fn test_fetching_user() { async fn test_fetching_user() {
let user = user("spez").await; let user = user("spez").await;

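Taken together with the routes registered in src/main.rs above, this means an instance started with REDLIB_ENABLE_RSS set serves feeds at /user/<name>.rss and /r/<sub>.rss, while instances without that setting return the "RSS is disabled on this instance." error page from either handler.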
src/utils.rs

@ -1,20 +1,25 @@
#![allow(dead_code)] #![allow(dead_code)]
use crate::config::get_setting; #![allow(clippy::cmp_owned)]
use crate::config::{self, get_setting};
// //
// CRATES // CRATES
// //
use crate::{client::json, server::RequestExt}; use crate::{client::json, server::RequestExt};
use askama::Template;
use cookie::Cookie; use cookie::Cookie;
use hyper::{Body, Request, Response}; use hyper::{Body, Request, Response};
use log::error; use log::error;
use once_cell::sync::Lazy; use once_cell::sync::Lazy;
use regex::Regex; use regex::Regex;
use rinja::Template;
use rust_embed::RustEmbed; use rust_embed::RustEmbed;
use serde::Serialize;
use serde_json::Value; use serde_json::Value;
use serde_json_path::{JsonPath, JsonPathExt};
use std::collections::{HashMap, HashSet}; use std::collections::{HashMap, HashSet};
use std::env; use std::env;
use std::str::FromStr; use std::str::FromStr;
use std::string::ToString;
use time::{macros::format_description, Duration, OffsetDateTime}; use time::{macros::format_description, Duration, OffsetDateTime};
use url::Url; use url::Url;
@ -44,6 +49,7 @@ pub enum ResourceType {
} }
// Post flair with content, background color and foreground color // Post flair with content, background color and foreground color
#[derive(Serialize)]
pub struct Flair { pub struct Flair {
pub flair_parts: Vec<FlairPart>, pub flair_parts: Vec<FlairPart>,
pub text: String, pub text: String,
@ -52,7 +58,7 @@ pub struct Flair {
} }
// Part of flair, either emoji or text // Part of flair, either emoji or text
#[derive(Clone)] #[derive(Clone, Serialize)]
pub struct FlairPart { pub struct FlairPart {
pub flair_part_type: String, pub flair_part_type: String,
pub value: String, pub value: String,
@ -94,12 +100,14 @@ impl FlairPart {
} }
} }
#[derive(Serialize)]
pub struct Author { pub struct Author {
pub name: String, pub name: String,
pub flair: Flair, pub flair: Flair,
pub distinguished: String, pub distinguished: String,
} }
#[derive(Serialize)]
pub struct Poll { pub struct Poll {
pub poll_options: Vec<PollOption>, pub poll_options: Vec<PollOption>,
pub voting_end_timestamp: (String, String), pub voting_end_timestamp: (String, String),
@ -127,6 +135,7 @@ impl Poll {
} }
} }
#[derive(Serialize)]
pub struct PollOption { pub struct PollOption {
pub id: u64, pub id: u64,
pub text: String, pub text: String,
@ -156,19 +165,21 @@ impl PollOption {
} }
// Post flags with nsfw and stickied // Post flags with nsfw and stickied
#[derive(Serialize)]
pub struct Flags { pub struct Flags {
pub spoiler: bool, pub spoiler: bool,
pub nsfw: bool, pub nsfw: bool,
pub stickied: bool, pub stickied: bool,
} }
#[derive(Debug)] #[derive(Debug, Serialize)]
pub struct Media { pub struct Media {
pub url: String, pub url: String,
pub alt_url: String, pub alt_url: String,
pub width: i64, pub width: i64,
pub height: i64, pub height: i64,
pub poster: String, pub poster: String,
pub download_name: String,
} }
impl Media { impl Media {
@ -235,6 +246,15 @@ impl Media {
let alt_url = alt_url_val.map_or(String::new(), |val| format_url(val.as_str().unwrap_or_default())); let alt_url = alt_url_val.map_or(String::new(), |val| format_url(val.as_str().unwrap_or_default()));
let download_name = if post_type == "image" || post_type == "gif" || post_type == "video" {
let permalink_base = url_path_basename(data["permalink"].as_str().unwrap_or_default());
let media_url_base = url_path_basename(url_val.as_str().unwrap_or_default());
format!("redlib_{permalink_base}_{media_url_base}")
} else {
String::new()
};
( (
post_type.to_string(), post_type.to_string(),
Self { Self {
@ -245,12 +265,14 @@ impl Media {
width: source["width"].as_i64().unwrap_or_default(), width: source["width"].as_i64().unwrap_or_default(),
height: source["height"].as_i64().unwrap_or_default(), height: source["height"].as_i64().unwrap_or_default(),
poster: format_url(source["url"].as_str().unwrap_or_default()), poster: format_url(source["url"].as_str().unwrap_or_default()),
download_name,
}, },
gallery, gallery,
) )
} }
} }
#[derive(Serialize)]
pub struct GalleryMedia { pub struct GalleryMedia {
pub url: String, pub url: String,
pub width: i64, pub width: i64,
@ -291,6 +313,7 @@ impl GalleryMedia {
} }
// Post containing content, metadata and media // Post containing content, metadata and media
#[derive(Serialize)]
pub struct Post { pub struct Post {
pub id: String, pub id: String,
pub title: String, pub title: String,
@ -298,6 +321,7 @@ pub struct Post {
pub body: String, pub body: String,
pub author: Author, pub author: Author,
pub permalink: String, pub permalink: String,
pub link_title: String,
pub poll: Option<Poll>, pub poll: Option<Poll>,
pub score: (String, String), pub score: (String, String),
pub upvote_ratio: i64, pub upvote_ratio: i64,
@ -309,11 +333,13 @@ pub struct Post {
pub domain: String, pub domain: String,
pub rel_time: String, pub rel_time: String,
pub created: String, pub created: String,
pub created_ts: u64,
pub num_duplicates: u64, pub num_duplicates: u64,
pub comments: (String, String), pub comments: (String, String),
pub gallery: Vec<GalleryMedia>, pub gallery: Vec<GalleryMedia>,
pub awards: Awards, pub awards: Awards,
pub nsfw: bool, pub nsfw: bool,
pub out_url: Option<String>,
pub ws_url: String, pub ws_url: String,
} }
@ -340,6 +366,7 @@ impl Post {
let data = &post["data"]; let data = &post["data"];
let (rel_time, created) = time(data["created_utc"].as_f64().unwrap_or_default()); let (rel_time, created) = time(data["created_utc"].as_f64().unwrap_or_default());
let created_ts = data["created_utc"].as_f64().unwrap_or_default().round() as u64;
let score = data["score"].as_i64().unwrap_or_default(); let score = data["score"].as_i64().unwrap_or_default();
let ratio: f64 = data["upvote_ratio"].as_f64().unwrap_or(1.0) * 100.0; let ratio: f64 = data["upvote_ratio"].as_f64().unwrap_or(1.0) * 100.0;
let title = val(post, "title"); let title = val(post, "title");
@ -386,6 +413,7 @@ impl Post {
width: data["thumbnail_width"].as_i64().unwrap_or_default(), width: data["thumbnail_width"].as_i64().unwrap_or_default(),
height: data["thumbnail_height"].as_i64().unwrap_or_default(), height: data["thumbnail_height"].as_i64().unwrap_or_default(),
poster: String::new(), poster: String::new(),
download_name: String::new(),
}, },
media, media,
domain: val(post, "domain"), domain: val(post, "domain"),
@ -409,18 +437,20 @@ impl Post {
stickied: data["stickied"].as_bool().unwrap_or_default() || data["pinned"].as_bool().unwrap_or_default(), stickied: data["stickied"].as_bool().unwrap_or_default() || data["pinned"].as_bool().unwrap_or_default(),
}, },
permalink: val(post, "permalink"), permalink: val(post, "permalink"),
link_title: val(post, "link_title"),
poll: Poll::parse(&data["poll_data"]), poll: Poll::parse(&data["poll_data"]),
rel_time, rel_time,
created, created,
created_ts,
num_duplicates: post["data"]["num_duplicates"].as_u64().unwrap_or(0), num_duplicates: post["data"]["num_duplicates"].as_u64().unwrap_or(0),
comments: format_num(data["num_comments"].as_i64().unwrap_or_default()), comments: format_num(data["num_comments"].as_i64().unwrap_or_default()),
gallery, gallery,
awards, awards,
nsfw: post["data"]["over_18"].as_bool().unwrap_or_default(), nsfw: post["data"]["over_18"].as_bool().unwrap_or_default(),
ws_url: val(post, "websocket_url"), ws_url: val(post, "websocket_url"),
out_url: post["data"]["url_overridden_by_dest"].as_str().map(|a| a.to_string()),
}); });
} }
Ok((posts, res["data"]["after"].as_str().unwrap_or_default().to_string())) Ok((posts, res["data"]["after"].as_str().unwrap_or_default().to_string()))
} }
} }
@ -450,7 +480,7 @@ pub struct Comment {
pub prefs: Preferences, pub prefs: Preferences,
} }
#[derive(Default, Clone)] #[derive(Default, Clone, Serialize)]
pub struct Award { pub struct Award {
pub name: String, pub name: String,
pub icon_url: String, pub icon_url: String,
@ -464,6 +494,7 @@ impl std::fmt::Display for Award {
} }
} }
#[derive(Serialize)]
pub struct Awards(pub Vec<Award>); pub struct Awards(pub Vec<Award>);
impl std::ops::Deref for Awards { impl std::ops::Deref for Awards {
@ -583,6 +614,7 @@ pub struct Preferences {
pub show_nsfw: String, pub show_nsfw: String,
pub blur_nsfw: String, pub blur_nsfw: String,
pub hide_hls_notification: String, pub hide_hls_notification: String,
pub video_quality: String,
pub hide_sidebar_and_summary: String, pub hide_sidebar_and_summary: String,
pub use_hls: String, pub use_hls: String,
pub ffmpeg_video_downloads: String, pub ffmpeg_video_downloads: String,
@ -639,6 +671,7 @@ impl Preferences {
use_hls: setting(req, "use_hls"), use_hls: setting(req, "use_hls"),
ffmpeg_video_downloads: setting(req, "ffmpeg_video_downloads"), ffmpeg_video_downloads: setting(req, "ffmpeg_video_downloads"),
hide_hls_notification: setting(req, "hide_hls_notification"), hide_hls_notification: setting(req, "hide_hls_notification"),
video_quality: setting(req, "video_quality"),
autoplay_videos: setting(req, "autoplay_videos"), autoplay_videos: setting(req, "autoplay_videos"),
fixed_navbar: setting_or_default(req, "fixed_navbar", "on".to_string()), fixed_navbar: setting_or_default(req, "fixed_navbar", "on".to_string()),
disable_visit_reddit_confirmation: setting(req, "disable_visit_reddit_confirmation"), disable_visit_reddit_confirmation: setting(req, "disable_visit_reddit_confirmation"),
@ -691,6 +724,8 @@ pub async fn parse_post(post: &Value) -> Post {
// Determine the type of media along with the media URL // Determine the type of media along with the media URL
let (post_type, media, gallery) = Media::parse(&post["data"]).await; let (post_type, media, gallery) = Media::parse(&post["data"]).await;
let created_ts = post["data"]["created_utc"].as_f64().unwrap_or_default().round() as u64;
let awards: Awards = Awards::parse(&post["data"]["all_awardings"]); let awards: Awards = Awards::parse(&post["data"]["all_awardings"]);
let permalink = val(post, "permalink"); let permalink = val(post, "permalink");
@ -727,6 +762,7 @@ pub async fn parse_post(post: &Value) -> Post {
distinguished: val(post, "distinguished"), distinguished: val(post, "distinguished"),
}, },
permalink, permalink,
link_title: val(post, "link_title"),
poll, poll,
score: format_num(score), score: format_num(score),
upvote_ratio: ratio as i64, upvote_ratio: ratio as i64,
@ -738,6 +774,7 @@ pub async fn parse_post(post: &Value) -> Post {
width: post["data"]["thumbnail_width"].as_i64().unwrap_or_default(), width: post["data"]["thumbnail_width"].as_i64().unwrap_or_default(),
height: post["data"]["thumbnail_height"].as_i64().unwrap_or_default(), height: post["data"]["thumbnail_height"].as_i64().unwrap_or_default(),
poster: String::new(), poster: String::new(),
download_name: String::new(),
}, },
flair: Flair { flair: Flair {
flair_parts: FlairPart::parse( flair_parts: FlairPart::parse(
@ -761,12 +798,14 @@ pub async fn parse_post(post: &Value) -> Post {
domain: val(post, "domain"), domain: val(post, "domain"),
rel_time, rel_time,
created, created,
created_ts,
num_duplicates: post["data"]["num_duplicates"].as_u64().unwrap_or(0), num_duplicates: post["data"]["num_duplicates"].as_u64().unwrap_or(0),
comments: format_num(post["data"]["num_comments"].as_i64().unwrap_or_default()), comments: format_num(post["data"]["num_comments"].as_i64().unwrap_or_default()),
gallery, gallery,
awards, awards,
nsfw: post["data"]["over_18"].as_bool().unwrap_or_default(), nsfw: post["data"]["over_18"].as_bool().unwrap_or_default(),
ws_url: val(post, "websocket_url"), ws_url: val(post, "websocket_url"),
out_url: post["data"]["url_overridden_by_dest"].as_str().map(|a| a.to_string()),
} }
} }
@ -912,12 +951,19 @@ pub fn rewrite_urls(input_text: &str) -> String {
// Rewrite Reddit links to Redlib // Rewrite Reddit links to Redlib
REDDIT_REGEX.replace_all(input_text, r#"href="/"#) REDDIT_REGEX.replace_all(input_text, r#"href="/"#)
.to_string(); .to_string();
text1 = REDDIT_EMOJI_REGEX
.replace_all(&text1, format_url(REDDIT_EMOJI_REGEX.find(&text1).map(|x| x.as_str()).unwrap_or_default())) loop {
.to_string() if REDDIT_EMOJI_REGEX.find(&text1).is_none() {
// Remove (html-encoded) "\" from URLs. break;
.replace("%5C", "") } else {
.replace("\\_", "_"); text1 = REDDIT_EMOJI_REGEX
.replace_all(&text1, format_url(REDDIT_EMOJI_REGEX.find(&text1).map(|x| x.as_str()).unwrap_or_default()))
.to_string()
}
}
// Remove (html-encoded) "\" from URLs.
text1 = text1.replace("%5C", "").replace("\\_", "_");
// Rewrite external media previews to Redlib // Rewrite external media previews to Redlib
loop { loop {
@ -973,6 +1019,83 @@ pub fn rewrite_urls(input_text: &str) -> String {
} }
} }
// These links all follow a pattern of "https://reddit-econ-prod-assets-permanent.s3.amazonaws.com/asset-manager/SUBREDDIT_ID/RANDOM_FILENAME.png"
static REDDIT_EMOTE_LINK_REGEX: Lazy<Regex> = Lazy::new(|| Regex::new(r#"https://reddit-econ-prod-assets-permanent.s3.amazonaws.com/asset-manager/(.*)"#).unwrap());
// These all follow a pattern of '"emote|SUBREDDIT_IT|NUMBER"', we want the number
static REDDIT_EMOTE_ID_NUMBER_REGEX: Lazy<Regex> = Lazy::new(|| Regex::new(r#""emote\|.*\|(.*)""#).unwrap());
pub fn rewrite_emotes(media_metadata: &Value, comment: String) -> String {
/* Create the paths we'll use to look for our data inside the json.
Because we don't know the name of any given emote we use a wildcard to parse them. */
let link_path = JsonPath::parse("$[*].s.u").expect("valid JSON Path");
let id_path = JsonPath::parse("$[*].id").expect("valid JSON Path");
let size_path = JsonPath::parse("$[*].s.y").expect("valid JSON Path");
// Extract all of the results from those json paths
let link_nodes = media_metadata.json_path(&link_path);
let id_nodes = media_metadata.json_path(&id_path);
// Initialize our vectors
let mut id_vec = Vec::new();
let mut link_vec = Vec::new();
// Add the relevant data to each of our vectors so we can access it by number later
for current_id in id_nodes {
id_vec.push(current_id)
}
for current_link in link_nodes {
link_vec.push(current_link)
}
/* Set index to the length of link_vec.
This is one larger than we'll actually be looking at, but we correct that later */
let mut index = link_vec.len();
// Comment needs to be in scope for when we call rewrite_urls()
let mut comment = comment;
/* Loop until index hits zero.
This also prevents us from trying to do anything on an empty vector */
while index != 0 {
/* Subtract 1 from index to get the real index we should be looking at.
Then continue on each subsequent loop to continue until we hit the last entry in the vector.
This is how we get this to deal with multiple emotes in a single message and properly replace each ID with it's link */
index -= 1;
// Convert our current index in id_vec into a string so we can search through it with regex
let current_id = id_vec[index].to_string();
/* The ID number can be multiple lengths, so we capture it with regex.
We also want to only attempt anything when we get matches to avoid panicking */
if let Some(id_capture) = REDDIT_EMOTE_ID_NUMBER_REGEX.captures(&current_id) {
// Format the ID to include the colons it has in the comment text
let id = format!(":{}:", &id_capture[1]);
// Convert current link to string to search through it with the regex
let link = link_vec[index].to_string();
// Make sure we only do operations when we get matches, otherwise we panic when trying to access the first match
if let Some(link_capture) = REDDIT_EMOTE_LINK_REGEX.captures(&link) {
/* Reddit sends a size for the image based on whether it's alone or accompanied by text.
It's a good idea and makes everything look nicer, so we'll do the same. */
let size = media_metadata.json_path(&size_path).first().unwrap().to_string();
// Replace the ID we found earlier in the comment with the respective image and it's link from the regex capture
let to_replace_with = format!(
"<img loading=\"lazy\" src=\"/emote/{} width=\"{size}\" height=\"{size}\" style=\"vertical-align:text-bottom\">",
&link_capture[1]
);
// Inside the comment replace the ID we found with the string that will embed the image
comment = comment.replace(&id, &to_replace_with).to_string();
}
}
}
// Call rewrite_urls() to transform any other Reddit links
rewrite_urls(&comment)
}
// Format vote count to a string that will be displayed. // Format vote count to a string that will be displayed.
// Append `m` and `k` for millions and thousands respectively, and // Append `m` and `k` for millions and thousands respectively, and
// round to the nearest tenth. // round to the nearest tenth.
@ -1079,6 +1202,28 @@ pub fn sfw_only() -> bool {
} }
} }
/// Returns true if the config/env variable REDLIB_ENABLE_RSS is set to "on".
/// If this variable is set as such, the instance will enable RSS feeds.
/// Otherwise, the instance will not provide RSS feeds.
pub fn enable_rss() -> bool {
match get_setting("REDLIB_ENABLE_RSS") {
Some(val) => val == "on",
None => false,
}
}
/// Returns true if the config/env variable `REDLIB_ROBOTS_DISABLE_INDEXING` carries the
/// value `on`.
///
/// If this variable is set as such, the instance will block all robots in robots.txt and
/// insert the noindex, nofollow meta tag on every page.
pub fn disable_indexing() -> bool {
match get_setting("REDLIB_ROBOTS_DISABLE_INDEXING") {
Some(val) => val == "on",
None => false,
}
}
// Determines if a request shoud redirect to a nsfw landing gate. // Determines if a request shoud redirect to a nsfw landing gate.
pub fn should_be_nsfw_gated(req: &Request<Body>, req_url: &str) -> bool { pub fn should_be_nsfw_gated(req: &Request<Body>, req_url: &str) -> bool {
let sfw_instance = sfw_only(); let sfw_instance = sfw_only();
@ -1120,6 +1265,34 @@ pub async fn nsfw_landing(req: Request<Body>, req_url: String) -> Result<Respons
Ok(Response::builder().status(403).header("content-type", "text/html").body(body.into()).unwrap_or_default()) Ok(Response::builder().status(403).header("content-type", "text/html").body(body.into()).unwrap_or_default())
} }
// Returns the last (non-empty) segment of a path string
pub fn url_path_basename(path: &str) -> String {
let url_result = Url::parse(format!("https://libredd.it/{path}").as_str());
if url_result.is_err() {
path.to_string()
} else {
let mut url = url_result.unwrap();
url.path_segments_mut().unwrap().pop_if_empty();
url.path_segments().unwrap().last().unwrap().to_string()
}
}
// Returns the URL of a post, as needed by RSS feeds
pub fn get_post_url(post: &Post) -> String {
if let Some(out_url) = &post.out_url {
// Handle cross post
if out_url.starts_with("/r/") {
format!("{}{}", config::get_setting("REDLIB_FULL_URL").unwrap_or_default(), out_url)
} else {
out_url.to_string()
}
} else {
format!("{}{}", config::get_setting("REDLIB_FULL_URL").unwrap_or_default(), post.permalink)
}
}
#[cfg(test)] #[cfg(test)]
mod tests { mod tests {
use super::{format_num, format_url, rewrite_urls}; use super::{format_num, format_url, rewrite_urls};
@ -1228,3 +1401,27 @@ fn test_rewriting_image_links() {
let output = r#"<p><figure><a href="/preview/pre/6awags382xo31.png?width=2560&amp;format=png&amp;auto=webp&amp;s=9c563aed4f07a91bdd249b5a3cea43a79710dcfc"><img loading="lazy" src="/preview/pre/6awags382xo31.png?width=2560&amp;format=png&amp;auto=webp&amp;s=9c563aed4f07a91bdd249b5a3cea43a79710dcfc"></a><figcaption>caption 1</figcaption></figure></p"#; let output = r#"<p><figure><a href="/preview/pre/6awags382xo31.png?width=2560&amp;format=png&amp;auto=webp&amp;s=9c563aed4f07a91bdd249b5a3cea43a79710dcfc"><img loading="lazy" src="/preview/pre/6awags382xo31.png?width=2560&amp;format=png&amp;auto=webp&amp;s=9c563aed4f07a91bdd249b5a3cea43a79710dcfc"></a><figcaption>caption 1</figcaption></figure></p"#;
assert_eq!(rewrite_urls(input), output); assert_eq!(rewrite_urls(input), output);
} }
#[test]
fn test_url_path_basename() {
// without trailing slash
assert_eq!(url_path_basename("/first/last"), "last");
// with trailing slash
assert_eq!(url_path_basename("/first/last/"), "last");
// with query parameters
assert_eq!(url_path_basename("/first/last/?some=query"), "last");
// file path
assert_eq!(url_path_basename("/cdn/image.jpg"), "image.jpg");
// when a full url is passed instead of just a path
assert_eq!(url_path_basename("https://doma.in/first/last"), "last");
// empty path
assert_eq!(url_path_basename("/"), "");
}
#[test]
fn test_rewriting_emotes() {
let json_input = serde_json::from_str(r#"{"emote|t5_31hpy|2028":{"e":"Image","id":"emote|t5_31hpy|2028","m":"image/png","s":{"u":"https://reddit-econ-prod-assets-permanent.s3.amazonaws.com/asset-manager/t5_31hpy/PW6WsOaLcd.png","x":60,"y":60},"status":"valid","t":"sticker"}}"#).expect("Valid JSON");
let comment_input = r#"<div class="comment_body "><div class="md"><p>:2028:</p></div></div>"#;
let output = r#"<div class="comment_body "><div class="md"><p><img loading="lazy" src="/emote/t5_31hpy/PW6WsOaLcd.png" width="60" height="60" style="vertical-align:text-bottom"></p></div></div>"#;
assert_eq!(rewrite_emotes(&json_input, comment_input.to_string()), output);
}
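rewrite_emotes leans on the serde_json_path crate: because emote names are not known ahead of time, the wildcard selector $[*] pulls every entry's URL, id, and size out of media_metadata. A small sketch of those lookups against a single-emote metadata object (same shape as the test fixture above, with made-up values):

use serde_json::json;
use serde_json_path::{JsonPath, JsonPathExt};

fn main() {
    let media_metadata = json!({
        "emote|t5_example|1234": {
            "id": "emote|t5_example|1234",
            "s": { "u": "https://example.invalid/emote.png", "y": 60 }
        }
    });
    let link_path = JsonPath::parse("$[*].s.u").expect("valid JSON Path");
    let size_path = JsonPath::parse("$[*].s.y").expect("valid JSON Path");

    // The returned node list yields one matched value per emote entry.
    let mut links = Vec::new();
    for node in media_metadata.json_path(&link_path) {
        links.push(node.to_string());
    }
    let size = media_metadata.json_path(&size_path).first().map(|v| v.to_string());

    assert_eq!(links.len(), 1);
    assert_eq!(size.as_deref(), Some("60"));
}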

static/check_update.js (new file, 35 lines)

@ -0,0 +1,35 @@
async function checkInstanceUpdateStatus() {
try {
const response = await fetch('/commits.json');
const text = await response.text();
const entries = JSON.parse(text);
const localCommit = document.getElementById('git_commit').dataset.value;
let statusMessage = '';
if (entries.length > 0) {
const commitHashes = Array.from(entries).map(entry => {
return entry.sha
});
const commitIndex = commitHashes.indexOf(localCommit);
if (commitIndex === 0) {
statusMessage = '✅ Instance is up to date.';
} else if (commitIndex > 0) {
statusMessage = `⚠️ This instance is not up to date and is ${commitIndex} commits old. Test and confirm on an up-to-date instance before reporting.`;
} else {
statusMessage = `⚠️ This instance is not up to date and is at least ${commitHashes.length} commits old. Test and confirm on an up-to-date instance before reporting.`;
}
} else {
statusMessage = '⚠️ Unable to fetch commit information.';
}
document.getElementById('update-status').innerText = statusMessage;
} catch (error) {
console.error('Error fetching commits:', error);
document.getElementById('update-status').innerText = '⚠️ Error checking update status.';
}
}
checkInstanceUpdateStatus();

File diff suppressed because it is too large.


@ -1,6 +1,6 @@
/* Black theme setting */ /* Black theme setting */
.black { .black {
--accent: #009a9a; --accent: #bb2b3b;
--green: #00a229; --green: #00a229;
--text: white; --text: white;
--foreground: #0f0f0f; --foreground: #0f0f0f;


@ -1,6 +1,6 @@
/* Dark theme setting */ /* Dark theme setting */
.dark{ .dark{
--accent: aqua; --accent: #d54455;
--green: #5cff85; --green: #5cff85;
--text: white; --text: white;
--foreground: #222; --foreground: #222;


@ -0,0 +1,14 @@
/* Libreddit black theme setting */
.libredditBlack {
--accent: #009a9a;
--green: #00a229;
--text: white;
--foreground: #0f0f0f;
--background: black;
--outside: black;
--post: black;
--panel-border: 2px solid #0f0f0f;
--highlighted: #0f0f0f;
--visited: #aaa;
--shadow: 0 1px 3px rgba(0, 0, 0, 0.1);
}


@ -0,0 +1,14 @@
/* Libreddit dark theme setting */
.libredditDark{
--accent: aqua;
--green: #5cff85;
--text: white;
--foreground: #222;
--background: #0f0f0f;
--outside: #1f1f1f;
--post: #161616;
--panel-border: 1px solid #333;
--highlighted: #333;
--visited: #aaa;
--shadow: 0 1px 3px rgba(0, 0, 0, 0.5);
}


@ -0,0 +1,19 @@
/* Libreddit light theme setting */
.libredditLight {
--accent: #009a9a;
--green: #00a229;
--text: black;
--foreground: #f5f5f5;
--background: #ddd;
--outside: #ececec;
--post: #eee;
--panel-border: 1px solid #ccc;
--highlighted: white;
--visited: #555;
--shadow: 0 1px 3px rgba(0, 0, 0, 0.1);
}
html:has(> .libredditLight) {
/* Hint color theme to browser for scrollbar */
color-scheme: light;
}


@ -1,6 +1,6 @@
/* Light theme setting */ /* Light theme setting */
.light { .light {
--accent: #009a9a; --accent: #bb2b3b;
--green: #00a229; --green: #00a229;
--text: black; --text: black;
--foreground: #f5f5f5; --foreground: #f5f5f5;


@ -115,7 +115,7 @@ let ffmpeg = null;
var filename = toSlug(videoElement.parentNode.parentNode.id || document.title) var filename = toSlug(videoElement.parentNode.parentNode.id || document.title)
const data = await ffmpeg.readFile('output.mp4'); const data = await ffmpeg.readFile('output.mp4');
saveAs(new Blob([data.buffer]),filename); saveAs(new Blob([data.buffer]),filename, {type: 'video/mp4'});
return return
} }
function saveAs(blob, filename) { // Yeah ok... function saveAs(blob, filename) { // Yeah ok...


@ -8,6 +8,9 @@
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" /> <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<meta name="description" content="View on Redlib, an alternative private front-end to Reddit."> <meta name="description" content="View on Redlib, an alternative private front-end to Reddit.">
<meta name="viewport" content="width=device-width, initial-scale=1.0"> <meta name="viewport" content="width=device-width, initial-scale=1.0">
{% if crate::utils::disable_indexing() %}
<meta name="robots" content="noindex, nofollow">
{% endif %}
<!-- General PWA --> <!-- General PWA -->
<meta name="theme-color" content="#1F1F1F"> <meta name="theme-color" content="#1F1F1F">
<!-- iOS Application --> <!-- iOS Application -->
@ -24,18 +27,20 @@
<link rel="manifest" type="application/json" href="/manifest.json"> <link rel="manifest" type="application/json" href="/manifest.json">
<link rel="shortcut icon" type="image/x-icon" href="/favicon.ico"> <link rel="shortcut icon" type="image/x-icon" href="/favicon.ico">
<link rel="stylesheet" type="text/css" href="/style.css?v={{ env!("CARGO_PKG_VERSION") }}"> <link rel="stylesheet" type="text/css" href="/style.css?v={{ env!("CARGO_PKG_VERSION") }}">
<!-- Video quality -->
<div id="video_quality" data-value="{{ prefs.video_quality }}"></div>
{% endblock %} {% endblock %}
</head> </head>
<body class=" <body class="
{% if prefs.layout != "" %}{{ prefs.layout }}{% endif %} {% if prefs.layout != "" %}{{ prefs.layout }}{% endif %}
{% if prefs.wide == "on" %} wide{% endif %} {% if prefs.wide == "on" || prefs.layout == "old" || prefs.layout == "waterfall" %} wide{% endif %}
{% if prefs.theme != "system" %} {{ prefs.theme }}{% endif %} {% if prefs.theme != "system" %} {{ prefs.theme }}{% endif %}
{% if prefs.fixed_navbar == "on" %} fixed_navbar{% endif %}"> {% if prefs.fixed_navbar == "on" %} fixed_navbar{% endif %}">
<!-- NAVIGATION BAR --> <!-- NAVIGATION BAR -->
<nav class=" <nav class="
{% if prefs.fixed_navbar == "on" %} fixed_navbar{% endif %}"> {% if prefs.fixed_navbar == "on" %} fixed_navbar{% endif %}">
<div id="logo"> <div id="logo">
<a id="redlib" href="/"><span id="lib">red</span><span id="reddit">sun</span><span id="lib">lib</span></a> <a id="redlib" href="/"><span id="lib">red</span><span id="reddit">sun</span><span id="lib">lib.</span></a>
{% block subscriptions %}{% endblock %} {% block subscriptions %}{% endblock %}
</div> </div>
{% block search %}{% endblock %} {% block search %}{% endblock %}
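The robots meta tag added above is only rendered when crate::utils::disable_indexing() returns true; the helper itself is not part of this diff. A minimal sketch of how such a switch could work, assuming it simply reads an instance-level environment variable (the variable name below is illustrative, not taken from the code):

    // Hypothetical sketch of an instance-level "do not index" switch.
    // REDLIB_ROBOTS_DISABLE_INDEXING is an assumed name, not confirmed by this diff.
    use std::env;

    pub fn disable_indexing() -> bool {
        env::var("REDLIB_ROBOTS_DISABLE_INDEXING")
            .map(|v| matches!(v.as_str(), "on" | "true" | "1"))
            .unwrap_or(false)
    }

The template then decides per request whether to emit the noindex, nofollow directive.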


@ -24,7 +24,7 @@
{% if author.flair.flair_parts.len() > 0 %}
<small class="author_flair">{% call utils::render_flair(author.flair.flair_parts) %}</small>
{% endif %}
- <a href="{{ post_link }}{{ id }}/?context=3" class="created" title="{{ created }}">{{ rel_time }}</a>
+ <a href="{{ post_link }}{{ id }}/?context=3#{{ id }}" class="created" title="{{ created }}">{{ rel_time }}</a>
{% if edited.0 != "".to_string() %}<span class="edited" title="{{ edited.1 }}">edited {{ edited.0 }}</span>{% endif %}
{% if !awards.is_empty() && prefs.hide_awards != "on" %}
<span class="dot">&bull;</span>


@ -6,9 +6,14 @@
<h1>{{ msg }}</h1>
<h3><a href="https://www.redditstatus.com/">Reddit Status</a></h3>
<br />
- <h3>Expected something to work? <a
- href="https://github.com/redlib-org/redlib/issues/new?assignees=&labels=bug&projects=&template=bug_report.md&title=%F0%9F%90%9B+Bug+Report%3A+{{ msg }}">Report
- an issue</a></h3>
+ <h3 id="update-status"></h3>
+ <br>
+ <div id="git_commit" data-value="{{ crate::instance_info::INSTANCE_INFO.git_commit }}"></div>
+ <script src="/check_update.js"></script>
+ <h3>Expected something to work? Try a <a href="https://github.com/redlib-org/redlib/">upstream</a> instance.</h3>
+ <br />
+ <h3 id="issue_warning" >!! Do <b>NOT</b> open an issue on the redlib repository with a redsunlib specific issue !!</h3>
<br />
<h3>Head back <a href="/">home</a>?</h3>
</div>
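The reworked error page embeds the running instance's commit hash in the #git_commit data attribute so that /check_update.js can compare it against the latest upstream commit and fill in #update-status. How INSTANCE_INFO.git_commit is populated is not shown here; a speculative sketch, assuming the hash is injected at build time through an environment variable:

    // Speculative sketch only: expose a build-time commit hash to templates.
    // GIT_HASH is an assumed build-time variable, not confirmed by this diff.
    use std::sync::LazyLock;

    pub struct InstanceInfo {
        pub git_commit: String,
    }

    pub static INSTANCE_INFO: LazyLock<InstanceInfo> = LazyLock::new(|| InstanceInfo {
        git_commit: option_env!("GIT_HASH").unwrap_or("unknown").to_string(),
    });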


@ -10,25 +10,34 @@
{% block content %}
<div id="column_one">
<form id="search_sort">
- <input id="search" type="text" name="q" placeholder="Search" value="{{ params.q|safe }}" title="Search redlib">
- {% if sub != "" %}
- <div id="inside">
- <input type="checkbox" name="restrict_sr" id="restrict_sr" {% if params.restrict_sr != "" %}checked{% endif %}>
- <label for="restrict_sr" class="search_label">in r/{{ sub }}</label>
- </div>
- {% endif %}
- {% if params.typed == "sr_user" %}<input type="hidden" name="type" value="sr_user">{% endif %}
- <select id="sort_options" name="sort" title="Sort results by">
- {% call utils::options(params.sort, ["relevance", "hot", "top", "new", "comments"], "") %}
- </select>{% if params.sort != "new" %}<select id="timeframe" name="t" title="Timeframe">
- {% call utils::options(params.t, ["hour", "day", "week", "month", "year", "all"], "all") %}
- </select>{% endif %}<button id="sort_submit" class="submit">
- <svg width="15" viewBox="0 0 110 100" fill="none" stroke-width="10" stroke-linecap="round">
- <path d="M20 50 H100" />
- <path d="M75 15 L100 50 L75 85" />
- &rarr;
- </svg>
- </button>
+ <div class="search_widget_divider_box">
+ <input id="search" type="text" name="q" placeholder="Search" value="{{ params.q|safe }}" title="Search redlib">
+ <div class="search_widget_divider_box">
+ {% if sub != "" %}
+ <div id="inside">
+ <input type="checkbox" name="restrict_sr" id="restrict_sr" {% if params.restrict_sr != "" %}checked{% endif %}>
+ <label for="restrict_sr" class="search_label">in r/{{ sub }}</label>
+ </div>
+ {% endif %}
+ {% if params.typed == "sr_user" %}<input type="hidden" name="type" value="sr_user">{% endif %}
+ <select id="sort_options" name="sort" title="Sort results by">
+ {% call utils::options(params.sort, ["relevance", "hot", "top", "new", "comments"], "") %}
+ </select>
+ {% if params.sort != "new" %}
+ <select id="timeframe" name="t" title="Timeframe">
+ {% call utils::options(params.t, ["hour", "day", "week", "month", "year", "all"], "all") %}
+ </select>
+ {% endif %}
+ </div>
+ </div>
+ <button id="sort_submit" class="submit">
+ <svg width="15" viewBox="0 0 110 100" fill="none" stroke-width="10" stroke-linecap="round">
+ <path d="M20 50 H100" />
+ <path d="M75 15 L100 50 L75 85" />
+ &rarr;
+ </svg>
+ </button>
</form>
{% if !is_filtered %}


@ -3,6 +3,10 @@
{% block title %}Redlib Settings{% endblock %}
+ {% block subscriptions %}
+ {% call utils::sub_list("") %}
+ {% endblock %}
{% block search %}
{% call utils::search("".to_owned(), "") %}
{% endblock %}
@ -37,17 +41,39 @@
<div class="prefs-group"> <div class="prefs-group">
<label for="layout">Layout:</label> <label for="layout">Layout:</label>
<select name="layout" id="layout"> <select name="layout" id="layout">
{% call utils::options(prefs.layout, ["card", "clean", "compact"], "card") %} {% call utils::options(prefs.layout, ["card", "clean", "compact", "old", "waterfall"], "card") %}
</select> </select>
</div> </div>
<div class="prefs-group"> <div class="prefs-group">
<label for="wide">Wide UI:</label> <label for="wide">Wide UI:</label>
<input type="hidden" value="off" name="wide"> <input type="hidden" value="off" name="wide">
<input type="checkbox" name="wide" id="wide" {% if prefs.wide == "on" %}checked{% endif %}> <input type="checkbox" name="wide" id="wide" {% if prefs.layout == "old" || prefs.layout == "waterfall" %}disabled{% endif %} {% if prefs.wide == "on" || prefs.layout == "old" || prefs.layout == "waterfall" %}checked{% endif %}>
{% if prefs.layout == "old" || prefs.layout == "waterfall" %}<span>ⓘ Wide UI is required for this layout</span>{% endif %}
</div>
<div class="prefs-group">
<label for="fixed_navbar">Keep navbar fixed</label>
<input type="hidden" value="off" name="fixed_navbar">
<input type="checkbox" name="fixed_navbar" {% if prefs.fixed_navbar == "on" %}checked{% endif %}>
</div>
<div class="prefs-group">
<label for="hide_sidebar_and_summary">Hide the summary and sidebar</label>
<input type="hidden" value="off" name="hide_sidebar_and_summary">
<input type="checkbox" name="hide_sidebar_and_summary" {% if prefs.hide_sidebar_and_summary == "on" %}checked{% endif %}>
</div>
<div class="prefs-group">
<label for="disable_visit_reddit_confirmation">Do not confirm before visiting content on Reddit</label>
<input type="hidden" value="off" name="disable_visit_reddit_confirmation">
<input type="checkbox" name="disable_visit_reddit_confirmation" {% if prefs.disable_visit_reddit_confirmation == "on" %}checked{% endif %}>
</div> </div>
</fieldset> </fieldset>
<fieldset> <fieldset>
<legend>Content</legend> <legend>Content</legend>
<div class="prefs-group">
<label for="video_quality">Video quality:</label>
<select name="video_quality" id="video_quality">
{% call utils::options(prefs.video_quality, ["best", "medium", "worst"], "best") %}
</select>
</div>
<div class="prefs-group"> <div class="prefs-group">
<label for="post_sort" title="Applies only to subreddit feeds">Default subreddit post sort:</label> <label for="post_sort" title="Applies only to subreddit feeds">Default subreddit post sort:</label>
<select name="post_sort"> <select name="post_sort">
@ -77,21 +103,24 @@
<input type="checkbox" name="blur_nsfw" id="blur_nsfw" {% if prefs.blur_nsfw == "on" %}checked{% endif %}> <input type="checkbox" name="blur_nsfw" id="blur_nsfw" {% if prefs.blur_nsfw == "on" %}checked{% endif %}>
</div> </div>
{% endif %} {% endif %}
<div class="prefs-group">
<label for="hide_awards">Hide awards</label>
<input type="hidden" value="off" name="hide_awards">
<input type="checkbox" name="hide_awards" id="hide_awards" {% if prefs.hide_awards == "on" %}checked{% endif %}>
</div>
<div class="prefs-group">
<label for="hide_score">Hide score</label>
<input type="hidden" value="off" name="hide_score">
<input type="checkbox" name="hide_score" id="hide_score" {% if prefs.hide_score == "on" %}checked{% endif %}>
</div>
</fieldset>
<fieldset>
<legend>Media</legend>
<div class="prefs-group"> <div class="prefs-group">
<label for="autoplay_videos">Autoplay videos</label> <label for="autoplay_videos">Autoplay videos</label>
<input type="hidden" value="off" name="autoplay_videos"> <input type="hidden" value="off" name="autoplay_videos">
<input type="checkbox" name="autoplay_videos" id="autoplay_videos" {% if prefs.autoplay_videos == "on" %}checked{% endif %}> <input type="checkbox" name="autoplay_videos" id="autoplay_videos" {% if prefs.autoplay_videos == "on" %}checked{% endif %}>
</div> </div>
<div class="prefs-group">
<label for="fixed_navbar">Keep navbar fixed</label>
<input type="hidden" value="off" name="fixed_navbar">
<input type="checkbox" name="fixed_navbar" {% if prefs.fixed_navbar == "on" %}checked{% endif %}>
</div>
<div class="prefs-group">
<label for="hide_sidebar_and_summary">Hide the summary and sidebar</label>
<input type="hidden" value="off" name="hide_sidebar_and_summary">
<input type="checkbox" name="hide_sidebar_and_summary" {% if prefs.hide_sidebar_and_summary == "on" %}checked{% endif %}>
</div>
<div class="prefs-group"> <div class="prefs-group">
<label for="use_hls">Use HLS for videos</label> <label for="use_hls">Use HLS for videos</label>
{% if prefs.ffmpeg_video_downloads != "on" %} {% if prefs.ffmpeg_video_downloads != "on" %}
@ -118,21 +147,6 @@
<input type="hidden" value="off" name="hide_hls_notification"> <input type="hidden" value="off" name="hide_hls_notification">
<input type="checkbox" name="hide_hls_notification" id="hide_hls_notification" {% if prefs.hide_hls_notification == "on" %}checked{% endif %}> <input type="checkbox" name="hide_hls_notification" id="hide_hls_notification" {% if prefs.hide_hls_notification == "on" %}checked{% endif %}>
</div> </div>
<div class="prefs-group">
<label for="hide_awards">Hide awards</label>
<input type="hidden" value="off" name="hide_awards">
<input type="checkbox" name="hide_awards" id="hide_awards" {% if prefs.hide_awards == "on" %}checked{% endif %}>
</div>
<div class="prefs-group">
<label for="hide_score">Hide score</label>
<input type="hidden" value="off" name="hide_score">
<input type="checkbox" name="hide_score" id="hide_score" {% if prefs.hide_score == "on" %}checked{% endif %}>
</div>
<div class="prefs-group">
<label for="disable_visit_reddit_confirmation">Do not confirm before visiting content on Reddit</label>
<input type="hidden" value="off" name="disable_visit_reddit_confirmation">
<input type="checkbox" name="disable_visit_reddit_confirmation" {% if prefs.disable_visit_reddit_confirmation == "on" %}checked{% endif %}>
</div>
</fieldset> </fieldset>
<input id="save" type="submit" value="Save"> <input id="save" type="submit" value="Save">
</div> </div>
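The new Video quality select stores one of best, medium, or worst in the video_quality preference, which base.html exposes to the player scripts through the #video_quality data attribute. A rough sketch of the server-side shape this implies, assuming preferences are plain strings; the struct below is illustrative rather than the actual redlib Preferences type:

    // Illustrative sketch of a preference field with a validated default.
    pub struct Preferences {
        pub video_quality: String,
    }

    impl Preferences {
        // Fall back to "best" when the stored value is missing or unrecognised.
        pub fn video_quality_or_default(&self) -> &str {
            match self.video_quality.as_str() {
                "best" | "medium" | "worst" => &self.video_quality,
                _ => "best",
            }
        }
    }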


@ -134,7 +134,13 @@
</form>
{% endif %}
</div>
- </div>
+ {% if crate::utils::enable_rss() %}
+ <div id="sub_rss">
+ <a href="/r/{{ sub.name }}.rss" title="RSS feed for r/{{ sub.name }}">
+ <button class="subscribe">RSS feed</button>
+ </a>
+ </div>
+ {% endif %}
</div>
</details>
<details class="panel" id="sidebar" open>


@ -1,76 +1,87 @@
{% extends "base.html" %} {% extends "base.html" %} {% import "utils.html" as utils %} {% block search %}
{% import "utils.html" as utils %} {% call utils::search("".to_owned(), "") %} {% endblock %} {% block title %}{{
user.name.replace("u/", "") }} (u/{{ user.name }}) - Redlib{% endblock %} {%
block subscriptions %} {% call utils::sub_list("") %} {% endblock %} {% block
body %}
<main>
{% if !is_filtered %}
<div id="column_one">
<form id="sort">
<div id="listing_options">
{% call utils::sort(["/user/", user.name.as_str()].concat(),
["overview", "comments", "submitted"], listing) %}
</div>
<select id="sort_select" name="sort">
{% call utils::options(sort.0, ["hot", "new", "top",
"controversial"], "") %}</select
>{% if sort.0 == "top" || sort.0 == "controversial" %}<select
id="timeframe"
name="t"
>
{% call utils::options(sort.1, ["hour", "day", "week", "month",
"year", "all"], "all") %}</select
>{% endif %}<button id="sort_submit" class="submit">
<svg
width="15"
viewBox="0 0 110 100"
fill="none"
stroke-width="10"
stroke-linecap="round"
>
<path d="M20 50 H100" />
<path d="M75 15 L100 50 L75 85" />
&rarr;
</svg>
</button>
</form>
{% block search %} {% if all_posts_hidden_nsfw %}
{% call utils::search("".to_owned(), "") %} <center>
{% endblock %} All posts are hidden because they are NSFW. Enable "Show NSFW posts"
in settings to view.
{% block title %}{{ user.name.replace("u/", "") }} (u/{{ user.name }}) - Redlib{% endblock %} </center>
{% endif %} {% if no_posts %}
{% block subscriptions %} <center>No posts were found.</center>
{% call utils::sub_list("") %} {% endif %} {% if all_posts_filtered %}
{% endblock %} <center>(All content on this page has been filtered)</center>
{% else %}
{% block body %} <div id="posts">
<main> {% for post in posts %} {% if post.flags.nsfw && prefs.show_nsfw !=
{% if !is_filtered %} "on" %} {% else if !post.title.is_empty() %} {% call
<div id="column_one"> utils::post_in_list(post) %} {% else %}
<form id="sort"> <div class="comment user-comment">
<div id="listing_options"> <div class="comment_left">
{% call utils::sort(["/user/", user.name.as_str()].concat(), ["overview", "comments", "submitted"], listing) %} <p class="comment_score" title="{{ post.score.1 }}">
</div> {% if prefs.hide_score != "on" %} {{ post.score.0 }} {%
<select id="sort_select" name="sort"> else %} &#x2022; {% endif %}
{% call utils::options(sort.0, ["hot", "new", "top", "controversial"], "") %}
</select>{% if sort.0 == "top" || sort.0 == "controversial" %}<select id="timeframe" name="t">
{% call utils::options(sort.1, ["hour", "day", "week", "month", "year", "all"], "all") %}
</select>{% endif %}<button id="sort_submit" class="submit">
<svg width="15" viewBox="0 0 110 100" fill="none" stroke-width="10" stroke-linecap="round">
<path d="M20 50 H100" />
<path d="M75 15 L100 50 L75 85" />
&rarr;
</svg>
</button>
</form>
{% if all_posts_hidden_nsfw %}
<center>All posts are hidden because they are NSFW. Enable "Show NSFW posts" in settings to view.</center>
{% endif %}
{% if no_posts %}
<center>No posts were found.</center>
{% endif %}
{% if all_posts_filtered %}
<center>(All content on this page has been filtered)</center>
{% else %}
<div id="posts">
{% for post in posts %}
{% if post.flags.nsfw && prefs.show_nsfw != "on" %}
{% else if !post.title.is_empty() %}
{% call utils::post_in_list(post) %}
{% else %}
<div class="comment">
<div class="comment_left">
<p class="comment_score" title="{{ post.score.1 }}">
{% if prefs.hide_score != "on" %}
{{ post.score.0 }}
{% else %}
&#x2022;
{% endif %}
</p> </p>
<div class="line"></div> <div class="line"></div>
</div> </div>
<details class="comment_right" open> <details class="comment_right" open>
<summary class="comment_data"> <summary class="comment_data">
<a class="comment_link" href="{{ post.permalink }}">Comment on r/{{ post.community }}</a> <a
<span class="created" title="{{ post.created }}">{{ post.rel_time }}</span> class="comment_link"
</summary> href="{{ post.permalink }}#{{ post.id }}"
<p class="comment_body">{{ post.body|safe }}</p> title="{{ post.link_title }}"
</details> >{{ post.link_title }}</a
</div> >
{% endif %} <div class="user_comment_data_divider">
{% endfor %} <span class="created-in">&nbsp;in&nbsp;</span>
<a
class="comment_subreddit"
href="/r/{{ post.community }}"
>r/{{ post.community }}</a
>
<span class="dot">&bull;</span>
<span class="created" title="{{ post.created }}"
>&nbsp;{{ post.rel_time }}</span
>
</div>
</summary>
<p class="comment_body">{{ post.body|safe }}</p>
</details>
</div>
{% endif %} {% endfor %}
{% if prefs.ffmpeg_video_downloads == "on" %} {% if prefs.ffmpeg_video_downloads == "on" %}
<script src="/ffmpeg/ffmpeg.js"></script> <script src="/ffmpeg/ffmpeg.js"></script>
<script src="/ffmpeg/ffmpeg-util.js"></script> <script src="/ffmpeg/ffmpeg-util.js"></script>
@ -79,61 +90,94 @@
<script src="/hls.min.js"></script> <script src="/hls.min.js"></script>
<script src="/videoUtils.js"></script> <script src="/videoUtils.js"></script>
{% endif %} {% endif %}
</div> </div>
{% endif %} {% endif %}
<footer> <footer>
{% if ends.0 != "" %} {% if ends.0 != "" %}
<a href="?sort={{ sort.0 }}&t={{ sort.1 }}&before={{ ends.0 }}" accesskey="P">PREV</a> <a
{% endif %} href="?sort={{ sort.0 }}&t={{ sort.1 }}&before={{ ends.0 }}"
accesskey="P"
{% if ends.1 != "" %} >PREV</a
<a href="?sort={{ sort.0 }}&t={{ sort.1 }}&after={{ ends.1 }}" accesskey="N">NEXT</a> >
{% endif %} {% endif %} {% if ends.1 != "" %}
</footer> <a
</div> href="?sort={{ sort.0 }}&t={{ sort.1 }}&after={{ ends.1 }}"
{% endif %} accesskey="N"
<aside> >NEXT</a
{% if is_filtered %} >
<center>(Content from u/{{ user.name }} has been filtered)</center> {% endif %}
{% endif %} </footer>
<div class="panel" id="user"> </div>
<img loading="lazy" id="user_icon" src="{{ user.icon }}" alt="User icon"> {% endif %}
<h1 id="user_title">{{ user.title }}</h1> <aside>
<p id="user_name">u/{{ user.name }}</p> {% if is_filtered %}
<div id="user_description">{{ user.description }}</div> <center>(Content from u/{{ user.name }} has been filtered)</center>
<div id="user_details"> {% endif %}
<label>Karma</label> <div class="panel" id="user">
<label>Created</label> <img
<div>{{ user.karma }}</div> loading="lazy"
<div>{{ user.created }}</div> id="user_icon"
</div> src="{{ user.icon }}"
<div id="user_actions"> alt="User icon"
{% let name = ["u_", user.name.as_str()].join("") %} />
<div id="user_subscription"> <h1 id="user_title">{{ user.title }}</h1>
{% if prefs.subscriptions.contains(name) %} <p id="user_name">u/{{ user.name }}</p>
<form action="/r/{{ name }}/unsubscribe?redirect={{ redirect_url }}" method="POST"> <div id="user_description">{{ user.description }}</div>
<button class="unsubscribe">Unfollow</button> <div id="user_details">
</form> <label>Karma</label>
{% else %} <label>Created</label>
<form action="/r/{{ name }}/subscribe?redirect={{ redirect_url }}" method="POST"> <div>{{ user.karma }}</div>
<button class="subscribe">Follow</button> <div>{{ user.created }}</div>
</form> </div>
{% endif %} <div id="user_actions">
</div> {% let name = ["u_", user.name.as_str()].join("") %}
<div id="user_filter"> <div id="user_subscription">
{% if prefs.filters.contains(name) %} {% if prefs.subscriptions.contains(name) %}
<form action="/r/{{ name }}/unfilter?redirect={{ redirect_url }}" method="POST"> <form
<button class="unfilter">Unfilter</button> action="/r/{{ name }}/unsubscribe?redirect={{ redirect_url }}"
</form> method="POST"
{% else %} >
<form action="/r/{{ name }}/filter?redirect={{ redirect_url }}" method="POST"> <button class="unsubscribe">Unfollow</button>
<button class="filter">Filter</button> </form>
</form> {% else %}
{% endif %} <form
</div> action="/r/{{ name }}/subscribe?redirect={{ redirect_url }}"
</div> method="POST"
</div> >
</aside> <button class="subscribe">Follow</button>
</main> </form>
{% endif %}
</div>
<div id="user_filter">
{% if prefs.filters.contains(name) %}
<form
action="/r/{{ name }}/unfilter?redirect={{ redirect_url }}"
method="POST"
>
<button class="unfilter">Unfilter</button>
</form>
{% else %}
<form
action="/r/{{ name }}/filter?redirect={{ redirect_url }}"
method="POST"
>
<button class="filter">Filter</button>
</form>
{% endif %}
</div>
{% if crate::utils::enable_rss() %}
<div id="user_rss">
<a
href="/u/{{ user.name }}.rss"
title="RSS feed for u/{{ user.name }}"
>
<button class="subscribe">RSS feed</button>
</a>
</div>
{% endif %}
</div>
</div>
</aside>
</main>
{% endblock %} {% endblock %}


@ -64,7 +64,7 @@
{% macro post(post) -%}
{% set post_should_be_blurred = post.flags.spoiler && prefs.blur_spoiler=="on" -%}
<!-- POST CONTENT -->
- <div class="post highlighted">
+ <div class="post highlighted{% if post_should_be_blurred %} post_blurred{% endif %}">
<p class="post_header">
<a class="post_subreddit" href="/r/{{ post.community }}">r/{{ post.community }}</a>
<span class="dot">&bull;</span>
@ -87,12 +87,12 @@
{% endif %}
</p>
<h1 class="post_title">
+ {{ post.title }}
{% if post.flair.flair_parts.len() > 0 %}
<a href="/r/{{ post.community }}/search?q=flair_name%3A%22{{ post.flair.text }}%22&restrict_sr=on"
class="post_flair"
style="color:{{ post.flair.foreground_color }}; background:{{ post.flair.background_color }};">{% call render_flair(post.flair.flair_parts) %}</a>
{% endif %}
- {{ post.title }}
{% if post.flags.nsfw %} <small class="nsfw">NSFW</small>{% endif %}
{% if post.flags.spoiler %} <small class="spoiler">Spoiler</small>{% endif %}
</h1>
@ -104,12 +104,11 @@
<a href="{{ post.media.url }}" class="post_media_image" > <a href="{{ post.media.url }}" class="post_media_image" >
{% if post.media.height == 0 || post.media.width == 0 %} {% if post.media.height == 0 || post.media.width == 0 %}
<!-- i.redd.it images special case --> <!-- i.redd.it images special case -->
<img width="100%" height="100%" loading="lazy" alt="Post image" src="{{ post.media.url }}"{%if post_should_be_blurred %} class="post_nsfw_blur"{% endif %}/> <img width="100%" height="100%" loading="lazy" alt="Post image" src="{{ post.media.url }}"/>
{% else %} {% else %}
<svg <svg
width="{{ post.media.width }}px" width="{{ post.media.width }}px"
height="{{ post.media.height }}px" height="{{ post.media.height }}px"
{%if post_should_be_blurred %}class="post_nsfw_blur"{% endif %}
xmlns="http://www.w3.org/2000/svg"> xmlns="http://www.w3.org/2000/svg">
<image width="100%" height="100%" href="{{ post.media.url }}"/> <image width="100%" height="100%" href="{{ post.media.url }}"/>
<desc> <desc>
@ -127,7 +126,7 @@
{% if prefs.use_hls == "on" && !post.media.alt_url.is_empty() || prefs.ffmpeg_video_downloads == "on" && !post.media.alt_url.is_empty() %}
<script src="/hls.min.js"></script>
<div class="post_media_content">
- <video class="post_media_video short {% if prefs.autoplay_videos == "on" %}hls_autoplay{% endif %}{%if post_should_be_blurred %} post_nsfw_blur{% endif %}" {% if post.media.width > 0 && post.media.height > 0 %}width="{{ post.media.width }}" height="{{ post.media.height }}"{% endif %} poster="{{ post.media.poster }}" preload="none" controls>
+ <video class="post_media_video short {% if prefs.autoplay_videos == "on" %}hls_autoplay{% endif %}" {% if post.media.width > 0 && post.media.height > 0 %}width="{{ post.media.width }}" height="{{ post.media.height }}"{% endif %} poster="{{ post.media.poster }}" preload="none" controls>
<source src="{{ post.media.alt_url }}" type="application/vnd.apple.mpegurl" />
<source src="{{ post.media.url }}" type="video/mp4" />
</video>
@ -135,7 +134,7 @@
<script src="/videoUtils.js"></script> <script src="/videoUtils.js"></script>
{% else %} {% else %}
<div class="post_media_content"> <div class="post_media_content">
<video class="post_media_video{%if post_should_be_blurred %} post_nsfw_blur{% endif %}" src="{{ post.media.url }}" controls {% if prefs.autoplay_videos == "on" %}autoplay{% endif %} loop><a href={{ post.media.url }}>Video</a></video> <video class="post_media_video" src="{{ post.media.url }}" controls {% if prefs.autoplay_videos == "on" %}autoplay{% endif %} loop><a href={{ post.media.url }}>Video</a></video>
</div> </div>
{% call render_hls_notification(post.permalink[1..]) %} {% call render_hls_notification(post.permalink[1..]) %}
{% endif %} {% endif %}
@ -158,7 +157,10 @@
{% endif %}
<!-- POST BODY -->
- <div class="post_body">{{ post.body|safe }}</div>
+ <div class="post_body">
+ {{ post.body|safe }}
+ {% call poll(post) %}
+ </div>
<div class="post_score" title="{{ post.score.1 }}">
{% if prefs.hide_score != "on" %}
{{ post.score.0 }}
@ -168,13 +170,32 @@
<span class="label"> Upvotes</span></div> <span class="label"> Upvotes</span></div>
<div class="post_footer"> <div class="post_footer">
<ul id="post_links"> <ul id="post_links">
<li class="desktop_item"><a href="{{ post.permalink }}">permalink</a></li> <li>
<li class="mobile_item"><a href="{{ post.permalink }}">link</a></li> <a href="{{ post.permalink }}">
<span class="desktop_item">perma</span>link
</a>
</li>
{% if post.num_duplicates > 0 %} {% if post.num_duplicates > 0 %}
<li class="desktop_item"><a href="/r/{{ post.community }}/duplicates/{{ post.id }}">duplicates</a></li> <li>
<li class="mobile_item"><a href="/r/{{ post.community }}/duplicates/{{ post.id }}">dupes</a></li> <a href="/r/{{ post.community }}/duplicates/{{ post.id }}">
dup<span class="desktop_item">licat</span>es
</a>
</li>
{% endif %}
{% if post.post_type == "link" %}
<li class="desktop_item"><a target="_blank" href="https://archive.is/latest/{{ post.media.url }}">archive.is</a></li>
<li class="mobile_item"><a target="_blank" href="https://archive.is/latest/{{ post.media.url }}">archive</a></li>
{% endif %} {% endif %}
{% call external_reddit_link(post.permalink) %} {% call external_reddit_link(post.permalink) %}
{% if post.media.download_name != "" %}
<li>
<a href="{{ post.media.url }}" download="{{ post.media.download_name }}">
<span class="mobile_item">dl</span>
<span class="desktop_item">download</span>
</a>
</li>
{% endif %}
</ul> </ul>
<p>{{ post.upvote_ratio }}%<span id="upvoted"> Upvoted</span></p> <p>{{ post.upvote_ratio }}%<span id="upvoted"> Upvoted</span></p>
</div> </div>
@ -182,8 +203,7 @@
{%- endmacro %}
{% macro external_reddit_link(permalink) %}
- {% for dev_type in ["desktop", "mobile"] %}
- <li class="{{ dev_type }}_item">
+ <li>
<a
{% if prefs.disable_visit_reddit_confirmation != "on" %}
href="#popup"
@ -197,12 +217,11 @@
{% call visit_reddit_confirmation(permalink) %}
{% endif %}
</li>
- {% endfor %}
{% endmacro %}
{% macro post_in_list(post) -%}
{% set post_should_be_blurred = (post.flags.nsfw && prefs.blur_nsfw=="on") || (post.flags.spoiler && prefs.blur_spoiler=="on") -%}
- <div class="post {% if post.flags.stickied %}stickied{% endif %}" id="{{ post.id }}">
+ <div class="post{% if post.flags.stickied %} stickied{% endif %}{% if post_should_be_blurred %} post_blurred{% endif %}" id="{{ post.id }}">
<p class="post_header">
{% let community -%}
{% if post.community.starts_with("u_") -%}
@ -233,7 +252,7 @@
<a href="{{ post.permalink }}">{{ post.title }}</a>{% if post.flags.nsfw %} <small class="nsfw">NSFW</small>{% endif %}{% if post.flags.spoiler %} <small class="spoiler">Spoiler</small>{% endif %} <a href="{{ post.permalink }}">{{ post.title }}</a>{% if post.flags.nsfw %} <small class="nsfw">NSFW</small>{% endif %}{% if post.flags.spoiler %} <small class="spoiler">Spoiler</small>{% endif %}
</h2> </h2>
<!-- POST MEDIA/THUMBNAIL --> <!-- POST MEDIA/THUMBNAIL -->
{% if (prefs.layout.is_empty() || prefs.layout == "card") && post.post_type == "image" %} {% if (prefs.layout.is_empty() || prefs.layout == "card" || prefs.layout == "waterfall") && post.post_type == "image" %}
<div class="post_media_content"> <div class="post_media_content">
<a href="{{ post.media.url }}" class="post_media_image {% if post.media.height < post.media.width*2 %}short{% endif %}" > <a href="{{ post.media.url }}" class="post_media_image {% if post.media.height < post.media.width*2 %}short{% endif %}" >
{% if post.media.height == 0 || post.media.width == 0 %} {% if post.media.height == 0 || post.media.width == 0 %}
@ -241,7 +260,6 @@
<img width="100%" height="100%" loading="lazy" alt="Post image" src="{{ post.media.url }}"/> <img width="100%" height="100%" loading="lazy" alt="Post image" src="{{ post.media.url }}"/>
{% else %} {% else %}
<svg <svg
{%if post_should_be_blurred %}class="post_nsfw_blur"{% endif %}
width="{{ post.media.width }}px" width="{{ post.media.width }}px"
height="{{ post.media.height }}px" height="{{ post.media.height }}px"
xmlns="http://www.w3.org/2000/svg"> xmlns="http://www.w3.org/2000/svg">
@ -253,26 +271,22 @@
{% endif %}
</a>
</div>
- {% else if (prefs.layout.is_empty() || prefs.layout == "card") && post.post_type == "gif" %}
- <div class="post_media_content">
- <video class="post_media_video short {%if post_should_be_blurred %}post_nsfw_blur{% endif %}" src="{{ post.media.url }}" {% if post.media.width > 0 && post.media.height > 0 %}width="{{ post.media.width }}" height="{{ post.media.height }}"{% endif %} poster="{{ post.media.poster }}" preload="none" controls loop {% if prefs.autoplay_videos == "on" %}autoplay{% endif %}><a href={{ post.media.url }}>Video</a></video>
- </div>
- {% else if (prefs.layout.is_empty() || prefs.layout == "card") && post.post_type == "video" %}
+ {% else if (prefs.layout.is_empty() || prefs.layout == "card" || prefs.layout == "waterfall") && (post.post_type == "gif" || post.post_type == "video") %}
{% if prefs.use_hls == "on" && !post.media.alt_url.is_empty() || prefs.ffmpeg_video_downloads == "on" && !post.media.alt_url.is_empty() %}
<div class="post_media_content">
- <video class="post_media_video short {%if post_should_be_blurred %}post_nsfw_blur{% endif %} {% if prefs.autoplay_videos == "on" %}hls_autoplay{% endif %}" {% if post.media.width > 0 && post.media.height > 0 %}width="{{ post.media.width }}" height="{{ post.media.height }}"{% endif %} poster="{{ post.media.poster }}" controls preload="none">
+ <video class="post_media_video short{% if prefs.autoplay_videos == "on" %} hls_autoplay{% endif %}" {% if post.media.width > 0 && post.media.height > 0 %}width="{{ post.media.width }}" height="{{ post.media.height }}"{% endif %} poster="{{ post.media.poster }}" controls preload="none">
<source src="{{ post.media.alt_url }}" type="application/vnd.apple.mpegurl" />
<source src="{{ post.media.url }}" type="video/mp4" />
</video>
</div>
{% else %}
<div class="post_media_content">
- <video class="post_media_video short {%if post_should_be_blurred %}post_nsfw_blur{% endif %}" src="{{ post.media.url }}" {% if post.media.width > 0 && post.media.height > 0 %}width="{{ post.media.width }}" height="{{ post.media.height }}"{% endif %} poster="{{ post.media.poster }}" preload="none" controls {% if prefs.autoplay_videos == "on" %}autoplay{% endif %}><a href={{ post.media.url }}>Video</a></video>
+ <video class="post_media_video short" src="{{ post.media.url }}" {% if post.media.width > 0 && post.media.height > 0 %}width="{{ post.media.width }}" height="{{ post.media.height }}"{% endif %} poster="{{ post.media.poster }}" preload="none" controls {% if prefs.autoplay_videos == "on" %}autoplay{% endif %}><a href={{ post.media.url }}>Video</a></video>
</div>
{% call render_hls_notification(format!("{}%23{}", &self.url[1..].replace("&", "%26").replace("+", "%2B"), post.id)) %}
{% endif %}
{% else if post.post_type != "self" %}
- <a class="post_thumbnail {% if post.thumbnail.url.is_empty() %}no_thumbnail{% endif %}" href="{% if post.post_type == "link" %}{{ post.media.url }}{% else %}{{ post.permalink }}{% endif %}" rel="nofollow">
+ <a class="post_thumbnail{% if post.thumbnail.url.is_empty() %} no_thumbnail{% endif %}" href="{% if post.post_type == "link" %}{{ post.media.url }}{% else %}{{ post.permalink }}{% endif %}" rel="nofollow">
{% if post.thumbnail.url.is_empty() %}
<svg viewBox="0 0 100 106" width="140" height="53" xmlns="http://www.w3.org/2000/svg">
<title>Thumbnail</title>
@ -280,7 +294,7 @@
</svg>
{% else %}
<div style="max-width:{{ post.thumbnail.width }}px;max-height:{{ post.thumbnail.height }}px;">
- <svg {% if post_should_be_blurred %} class="thumb_nsfw_blur" {% endif %} width="{{ post.thumbnail.width }}px" height="{{ post.thumbnail.height }}px" xmlns="http://www.w3.org/2000/svg">
+ <svg width="{{ post.thumbnail.width }}px" height="{{ post.thumbnail.height }}px" xmlns="http://www.w3.org/2000/svg">
<image width="100%" height="100%" href="{{ post.thumbnail.url }}"/>
<desc>
<img loading="lazy" alt="Thumbnail" src="{{ post.thumbnail.url }}"/>