Compare commits

...

183 Commits

Author SHA1 Message Date
c597a20311 Add websockets URL to parsing 2024-01-03 20:06:08 -05:00
89ba46e15d Redirect and proxy redditstatic gifs in-body (fix #14) 2024-01-03 09:36:19 -05:00
3dee29f3ef Add scrolling to highlighted comment (fix #13) 2024-01-02 19:43:00 -05:00
dea805936c Fix preview URL (fixes libreddit/libreddit/issues/559) 2024-01-02 19:21:24 -05:00
0c79cefed7 Fix publish action 2024-01-02 19:05:36 -05:00
5cb36ee15d v0.31.0 2024-01-02 18:45:19 -05:00
5bdcf64237 Update links and branches 2024-01-02 16:45:31 -05:00
3755f0cb24 README images (fix #9) 2024-01-02 00:39:11 -05:00
3145a6286b Update test runner to *run* cargo nextest run 2023-12-31 13:26:46 -05:00
c2e650b03b Update test runner to cargo nextest run 2023-12-30 21:46:37 -05:00
6d97f4c8dd Change Tokio tests - fix GHA runner (again) 2023-12-30 21:33:27 -05:00
cd836308db Update oauth.rs to use Android client only (fixes #8) 2023-12-30 17:32:54 -05:00
d327ab2c95 Small changes to params generation in subreddit.rs 2023-12-30 17:10:46 -05:00
53e8811f32 Remove all stats tracking (fixes #7) 2023-12-30 10:22:49 -05:00
d86b77ab56 Reset test threads to 1 (should fix test issues in GHA) 2023-12-29 20:33:43 -05:00
90a800ff44 Remove share parameters at canonical_path 2023-12-29 19:34:57 -05:00
45d8f1bbc8 Better handle redirects with new OAuth endpoints 2023-12-29 19:28:41 -05:00
3a4a39f577 Add config tests 2023-12-28 19:15:00 -05:00
ce0c6eca8a Fix obfuscated link handling 2023-12-28 18:21:07 -05:00
878ef8e95e Change formatting of autogenerated script 2023-12-28 17:36:45 -05:00
05f9d4f3bd Add count to update_oauth_resources.sh 2023-12-28 15:53:09 -05:00
0955f902f8 Fix update_oauth_resources.sh 2023-12-28 15:49:29 -05:00
d4c4d61ce8 Merge pull request #5 from redlib-org/improve_spoofing
Improve spoofing
2023-12-28 15:44:52 -05:00
c1214939ef Add requirements to scripts/update_oauth_resources.sh 2023-12-28 15:38:31 -05:00
9f41af6eee Improve spoofing - match headers more closely, pull in real versions/builds 2023-12-28 15:37:02 -05:00
4461a7d172 Only rerun build script if src/ changes 2023-12-28 15:21:06 -05:00
9850109326 Minor stylistic changes 2023-12-28 12:42:06 -05:00
bfe1c3db57 Fix Settings HTML 2023-12-28 11:35:06 -05:00
28f39329ad Add package name to instance_info in order to identify redlib instances 2023-12-28 11:21:56 -05:00
1316c8491c Merge pull request #4 from redlib-org/fix_popular_localization
Fix /r/popular localization - set geo filter to global
2023-12-28 10:43:10 -05:00
42902cc8d0 Add test for popular globalization 2023-12-28 10:40:17 -05:00
b43ed01958 Fix /r/popular localization - set geo filter to global 2023-12-28 10:30:42 -05:00
7d952f7f18 [FIX] Readme fix commands 2023-12-27 00:20:22 -05:00
819be89f84 Update cargo.toml 2023-12-27 00:19:54 -05:00
5f84d12774 0.30.2 (update config handling + test GH release) 2023-12-27 00:11:24 -05:00
5e68a66e40 Accept legacy config files 2023-12-27 00:11:06 -05:00
457b0bd57e Accept legacy environment variables 2023-12-26 23:16:36 -05:00
47822d8d6c Fix clippy warning 2023-12-26 23:15:06 -05:00
e452b8d6b5 Update oauth::choose to use new fastrand::choose_multiple 2023-12-26 20:00:36 -05:00
09df7713b1 Fix more README links 2023-12-26 19:55:37 -05:00
a2ca895e1f Fix docker repo URL - GHA 2023-12-26 19:09:54 -05:00
3fd3d4e145 Update action versions :fingers_crossed: 2023-12-26 19:01:45 -05:00
4c78ab30d3 More README changes, modify contrib/ 2023-12-26 18:48:09 -05:00
c5d11f220e Fix clippy warnings 2023-12-26 18:27:25 -05:00
b0f985c687 Libreddit -> Redlib 2023-12-26 18:25:52 -05:00
dac059573d Update rest of Cargo.toml, excepting askama, futures-lite, hyper 2023-12-26 17:50:56 -05:00
3e3c30d7f1 Update cookie + changes 2023-12-26 16:24:53 -05:00
e82c3fbea0 Update 11-15 of deps 2023-12-26 16:12:42 -05:00
d9f7ebcb79 Update util calls 2023-12-26 16:12:00 -05:00
36357e2609 Cargo update 2023-12-26 15:55:35 -05:00
b7bf9c74be Fix import error 2023-12-26 15:54:43 -05:00
1c36467c9c Merge remote-tracking branch 'origin/pull/867' 2023-12-26 15:52:53 -05:00
d76051302e Merge remote-tracking branch 'origin/pull/738' 2023-12-26 15:51:15 -05:00
90d1831352 Merge remote-tracking branch 'origin/pull/819' 2023-12-26 15:48:27 -05:00
3ac2048247 Merge remote-tracking branch 'origin/pull/861' 2023-12-26 15:47:33 -05:00
902acb257d Merge remote-tracking branch 'origin/pull/176' 2023-12-26 15:47:12 -05:00
28611da602 Add seccomp (merge 441) 2023-12-26 15:46:20 -05:00
f26c8be931 Add TokyoNight - Merge 450 2023-12-26 15:45:12 -05:00
de268314f3 Fix tests 2023-12-26 15:42:41 -05:00
8c9565c57b Readme values 2023-12-26 15:22:34 -05:00
0eb5e18cef Merge remote-tracking branch 'origin/pull/536' 2023-12-26 15:20:21 -05:00
fc4b686607 Merge remote-tracking branch 'origin/pull/746' 2023-12-26 15:18:49 -05:00
c17af7db75 Small changes to table and html 2023-12-26 15:18:29 -05:00
6625d106c3 Merge remote-tracking branch 'origin/pull/753' 2023-12-26 15:17:23 -05:00
82fdcf7443 Merge remote-tracking branch 'origin/pull/768' 2023-12-26 15:15:06 -05:00
2a525b744a Merge remote-tracking branch 'origin/pull/854' 2023-12-26 15:14:46 -05:00
c71a9dddd7 Merge remote-tracking branch 'origin/pull/857' 2023-12-26 15:13:46 -05:00
cc9023dc64 Merge remote-tracking branch 'origin/pull/865' 2023-12-26 15:12:36 -05:00
f5b54197c4 Merge remote-tracking branch 'origin/pull/808' 2023-12-26 15:11:44 -05:00
9b71822be6 Match on both http and https in format_url (414) 2023-12-26 15:11:16 -05:00
9d948abadc Merge pull request #831 from bennettmsherman/header-filters
Remove Reddit's 'Nel' and 'Report-To' (network error logging) response headers
2023-11-29 09:25:16 -05:00
2d64c092ea Fix short links again. Just using a split 2023-11-21 21:34:13 -08:00
2b06d22687 Made userpage posts show subreddit name
Made userpage posts show subreddit name instead of ambiguous `COMMENT`
2023-11-08 13:13:46 +02:00
3e236e7ab5 client.rs: remove some String allocations 2023-10-27 09:05:22 -04:00
469aff0689 Handle obfuscated share links 2023-10-04 09:55:33 -07:00
dd611b17ad Add start URL to manifest.json
Adds the last criterion to make the application an installable PWA on mobile.
2023-09-17 23:00:01 -07:00
b39db0fcd4 Make launchd service for macOS 2023-08-23 13:14:32 -04:00
2815dc5209 Correct the shutdown announcement 2023-07-14 12:05:57 -07:00
00697c6ae4 Add shut down announcement 2023-07-14 11:57:25 -07:00
7a14975fb8 Remove 'Nel' and 'Report-To' response headers 2023-07-08 19:20:58 -07:00
ea696687be Merge pull request #821 from fawni/feat/hide-subreddit-panel 2023-06-09 19:01:57 -04:00
136aa0aa7d Format 2023-06-09 17:32:21 -04:00
a39bb9d502 Merge branch 'master' into reddit-stats 2023-06-09 17:31:12 -04:00
5f562876f4 Make stats collection opt-out 2023-06-09 17:26:23 -04:00
f7f1aa4bde Abstract out random choosing 2023-06-08 16:27:36 -04:00
c00beaa5d8 Improve OAuth refresh, logging 2023-06-08 14:33:54 -04:00
49dde7ad72 Improve subreddit test 2023-06-08 14:06:58 -04:00
13394b4a5e Add ability to hide subreddit panel (closes #801) 2023-06-07 13:51:27 +03:00
0ca0eefaa4 Add tests to check fetching sub/user/oauth 2023-06-06 15:28:36 -04:00
6cd53abd42 Documentation 2023-06-06 15:26:31 -04:00
dc7601375e Ignore dotenv failure 2023-06-06 15:07:11 -04:00
659a82bf63 Improve spoofing of devices, handle token refreshes 2023-06-06 15:05:20 -04:00
a5833dc05c Add .env to .gitignore 2023-06-06 15:04:06 -04:00
e94a9c81e2 Add deps - rand, logging 2023-06-06 14:33:01 -04:00
8a23616920 Stray space 2023-06-05 20:57:34 -04:00
00355de727 Set proper headers 2023-06-05 20:39:56 -04:00
383d2789ce Initial PoC of spoofing Android OAuth 2023-06-05 20:31:25 -04:00
ba89b76332 Merge pull request #814 from Tokarak/deps-update 2023-06-04 18:14:27 -04:00
96e9e0ea9f Update .replit to download from nightly build artifacts (#815) 2023-06-03 23:36:39 +00:00
c1dd1a091e Update release binary paths 2023-06-03 16:30:58 -04:00
05ae39f743 Update RUSTFLAGS 2023-06-03 16:15:24 -04:00
221260c282 Remove MUSL, build statically via flags 2023-06-03 16:12:48 -04:00
f3c835bee7 Proof-read README.md 2023-06-03 20:02:02 +01:00
f9fd54aa3c Specify newer dependencies + cargo update 2023-06-03 19:41:32 +01:00
510d967777 Add MUSL target 2023-06-03 14:33:27 -04:00
0bcebff6f2 Fix YAML formatting 2023-06-03 14:24:19 -04:00
0c74305617 Add MUSL builds to GH Actions and fix Release event trigger (#810) 2023-06-03 18:19:20 +00:00
97f0f69059 Rebase #811 (#812)
Co-authored-by: Matthew Esposito <matt@matthew.science>
2023-06-03 17:32:46 +00:00
255307a4f7 Add request stats to instance info HTML 2023-05-31 20:02:00 -04:00
b5fc4bef28 Fix github-actions versioning 2023-05-31 19:50:38 -04:00
81a6e6458c ci: cleanup github actions (#803) 2023-05-31 23:47:58 +00:00
de68409610 Add request stats to instance info page 2023-05-31 19:39:44 -04:00
193a6effbf Merge pull request #792 from beucismis/master 2023-05-31 18:42:39 -04:00
09551fca29 Merge pull request #806 from gmnsii/comment-searchbar-color 2023-05-31 18:40:25 -04:00
38ee0d9428 make comment search bar color change based on theme 2023-05-31 19:41:13 +02:00
ca7ad9f812 Merge pull request #796 from StuffNoOneCaresAbout/lazy-init-regex 2023-05-01 10:09:59 -04:00
98e2833881 Merge pull request #790 from StuffNoOneCaresAbout/allow-disabling-indexing 2023-05-01 10:08:20 -04:00
4d5c52b83b Rename variables to more descriptive names. 2023-05-01 05:00:49 +01:00
6c47ea921b performance: compile regex only once 2023-05-01 04:22:10 +01:00
6c0e5cfe93 Add cursor:pointer for button and select 2023-04-29 21:16:02 +03:00
0c591149d5 Add option to disable all indexing. 2023-04-26 12:52:12 +01:00
8b4b2dd268 Ignore idea files. 2023-04-26 12:52:00 +01:00
ac58bb532a Merge pull request #787 from libreddit/clippy_refactor 2023-04-19 13:08:44 -04:00
af8fe176ea Fix clippy warnings 2023-04-19 10:37:47 -04:00
bfa9c084bb Merge pull request #786 from libreddit/update_deps 2023-04-19 10:32:46 -04:00
3c892d3cfd Update Cargo.lock - h2 moderate 2023-04-19 10:27:50 -04:00
4a1b448abb Merge pull request #776 from iTzBoboCz/polls 2023-04-17 18:12:02 -04:00
991677cd1e Add variable for now_utc, format 2023-04-17 18:00:41 -04:00
3b8a13d050 Merge pull request #773 from libreddit/fmt_clippy 2023-04-15 11:01:19 -04:00
0e90ebc1a1 Merge pull request #769 from gmnsii/bypass-gate 2023-04-15 11:00:20 -04:00
af89d4c88f Merge pull request #778 from Akanksh12/comments-to-contrib-files 2023-04-15 10:59:28 -04:00
5f87875b8e Merge branch 'master' into bypass-gate 2023-04-15 10:56:28 -04:00
aaf05de1a8 Merge pull request #771 from gmnsii/comment-search 2023-04-15 10:55:10 -04:00
17f7f6a9d1 changed default port to 12345 2023-04-08 21:17:19 +05:30
ec226e0cab fix(polls): apply clippy suggestions 2023-04-08 10:41:12 +02:00
2b8931c032 Merge pull request #770 from invakid404/patch-1
fix(style): fit footer width to body size
2023-04-07 12:05:41 -04:00
62771bf4a3 Merge pull request #751 from master-hax/optimize-docker
optimize arm dockerfile
2023-04-07 12:02:03 -04:00
22e3e0eb91 added comments to libreddit.service and .conf 2023-04-06 10:06:37 +05:30
94a781c82c fix(polls): minor improvements 2023-04-01 14:31:39 +02:00
75af984154 fix(polls): apply suggestions and fix id parsing 2023-04-01 14:26:04 +02:00
8bed342a6d fix: print time suffix only for relative dates 2023-04-01 13:21:15 +02:00
de5d8d5f86 Requested code style changes 2023-03-26 11:52:02 -07:00
f465394f93 Address fmt + clippy 2023-03-25 16:32:42 -04:00
1e418619f1 Feat: search for comments within posts
Add the ability to search for specific comments within posts.
Known issues:
  - Just like on Reddit, this does not work with comment sorting. The
    sorting order is ignored during the search, and changing the sorting
    order after the search does not change anything. I do not think we
    can fix this before Reddit does, since as far as I understand we rely
    on them for the sorting. However, we could implement a default
    sorting method ourselves by taking the vector of comments returned
    from the search and sorting it manually.
  - The UI could be improved on mobile. On screens narrower than
    480 pixels, the comment search bar is displayed below the comment
    sorting form. It would be great if we could make the search bar the
    same width as the whole comment sorting form, but I do not have the
    willpower to write any more CSS.
2023-03-24 17:41:26 -07:00
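The fallback the author suggests, sorting the returned comment vector manually, could look roughly like the following Rust sketch. The `Comment` struct and its `score` field are hypothetical stand-ins for illustration, not Redlib's actual types.

```rust
// Hypothetical sketch of the manual default sort suggested above.
// `Comment` and `score` are illustrative stand-ins, not Redlib's real types.
struct Comment {
    score: i64,
    body: String,
}

// Since Reddit ignores the requested sort order for comment searches,
// sort the search results ourselves (highest score first).
fn sort_search_results(mut comments: Vec<Comment>) -> Vec<Comment> {
    comments.sort_by(|a, b| b.score.cmp(&a.score));
    comments
}

fn main() {
    let results = vec![
        Comment { score: 3, body: "ok".to_string() },
        Comment { score: 42, body: "top".to_string() },
    ];
    let sorted = sort_search_results(results);
    println!("{}", sorted[0].body); // prints "top"
}
```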
8be69f6fe5 Checks if the link contains the parameter instead of ending with it
To know if the gate should be bypassed, we check whether the link contains
the parameter instead of checking whether the link ends with it. This is
important if, for example, we were to implement searching for comments
within a post. If we wanted to search for comments within a post whose
gate we have bypassed, the link would look like
https://libreddit-instance/r/somesub/comments/post-id/post-title&bypass_nsfw_landing/?q=some-query&type=comment
2023-03-23 12:36:04 -07:00
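As an illustration of the check described above, here is a minimal Rust sketch; the helper name is hypothetical, while the parameter and URL shape are taken from the commit message.

```rust
// Hypothetical helper illustrating the change described above: match the
// bypass parameter anywhere in the link rather than only at its end, so it
// still works when a query string follows it.
fn gate_bypassed(link: &str) -> bool {
    // Old check: link.ends_with("&bypass_nsfw_landing")
    link.contains("&bypass_nsfw_landing")
}

fn main() {
    let link = "/r/somesub/comments/post-id/post-title&bypass_nsfw_landing/?q=some-query&type=comment";
    assert!(gate_bypassed(link));
    assert!(!link.ends_with("&bypass_nsfw_landing")); // the old check would miss this
}
```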
e3b1c5b587 Use a bullet instead of empty margin when score is hidden
This is prettier and keeps consistency across the app.
2023-03-23 11:29:28 -07:00
a0726c5903 Change the bypass message and format code
The bypass message now indicates that the bypass is only temporary.
2023-03-23 11:09:33 -07:00
c1c867a5ff feat: add polls 2023-03-23 13:21:09 +01:00
5dc3279ac3 fix: make time work with future dates 2023-03-23 13:18:48 +01:00
dead990ba0 fix(style): fit footer width to body size 2023-03-23 13:49:40 +02:00
e046144bf3 Allow bypassing nsfw gate for posts
On instances that are not sfw-only, the nsfw gate for posts can now be
bypassed.
2023-03-22 23:18:35 -07:00
f8ba3cf815 Fix formatting in some places 2023-03-22 20:29:48 -07:00
df3d894947 Add option to hide score
Add the option to hide score for posts and comments in preferences.
There is still however a blank margin where the score is supposed to be.
2023-03-22 20:08:20 -07:00
e25622dac2 harden docker-compose.yml (#760)
`user: nobody`: the least privileged account.
`read_only: true`: this container doesn't write anything to the filesystem, so this removes an attack vector.
`security_opt`: disallows the container to grab more privileges.
`cap_drop`: this container doesn't need any capabilities, drop them.
`networks`: put `libreddit` into its own network so it cannot see other containers by default.
2023-03-17 10:17:01 -06:00
6bcc4aa368 Update version string in Cargo.lock. 2023-03-17 09:36:52 -06:00
eb0928acc3 add link to reddit status page. 2023-03-14 22:21:41 +01:00
6d652fc38c optimize arm dockerfile 2023-03-12 23:36:25 -07:00
f62f7bf200 v0.30.1 2023-03-10 21:34:42 -07:00
aece392a86 Pad bottom of body to prevent footer collision (fixes #747) 2023-03-10 21:33:45 -07:00
aeeb066e47 Update README.md (#748)
* Remove duplicated config

Was accidentally introduced in  412ce8f1f3
2023-03-10 21:04:05 -07:00
741613e27f Ignore errors while fetching subreddit names in subscriptions_filters()
If we can't retrieve the subreddit name, just use the user-supplied name.
This fixes banned subreddits being impossible to unfilter or
unsubscribe from.
A drawback of this approach is that it might be possible to subscribe to
a subreddit twice with different casing; however, the chance of this is
extremely low.
2023-03-09 15:18:03 +03:00
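The fallback described above amounts to swallowing the lookup error and keeping the user-supplied string. A minimal sketch, with a hypothetical stand-in for the real lookup:

```rust
// Hypothetical stand-in for the real subreddit-name lookup; here it always
// fails, as it would for a banned subreddit.
fn fetch_subreddit_name(user_supplied: &str) -> Result<String, String> {
    Err(format!("could not resolve r/{user_supplied}"))
}

// If the lookup errors, keep the name the user typed so the subreddit can
// still be unfiltered or unsubscribed from.
fn display_name(user_supplied: &str) -> String {
    fetch_subreddit_name(user_supplied).unwrap_or_else(|_| user_supplied.to_string())
}

fn main() {
    println!("{}", display_name("bannedsub")); // falls back to "bannedsub"
}
```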
51cdf574f7 v0.30.0 2023-03-08 22:15:31 -07:00
af6722c053 Move unimportant links to footer (#728) 2023-03-08 22:14:43 -07:00
412ce8f1f3 Fix default subscriptions (#732)
Co-authored-by: Daniel Valentine <daniel@vielle.ws>
2023-03-08 21:53:23 -07:00
dfa57c890d fix build error on windows (#741) 2023-03-08 21:32:41 -07:00
01f9907aaf show the count of 'more replies'. (#740)
Co-authored-by: Daniel Valentine <daniel@vielle.ws>
Co-authored-by: Matthew Esposito <matt@matthew.science>
2023-03-08 21:30:41 -07:00
bf19ff513f add support for gifs in galleries. (#744) 2023-03-08 21:04:26 -07:00
ffc9ca2e98 use the documented LIBREDDIT_DEFAULT_DISABLE_VISIT_REDDIT_CONFIRMATION config option. (#737) 2023-03-04 13:04:40 -07:00
a7f59ccac1 Display previews and inline images for reddit-hosted images.
This version does not change the SVG behaviour in the other cases, in an attempt to reduce breakage.
2023-03-03 13:50:05 +01:00
cef9266648 Restructure section on Libreddit user privacy. 2023-02-26 03:35:36 -07:00
d3b4f4e379 Update tempfile to v3.4.0. 2023-02-26 03:11:17 -07:00
b90b41c009 v0.29.4 2023-02-26 03:01:35 -07:00
pin 0eccb9bcf2 Add NetBSD install (#720) 2023-02-26 01:13:56 -07:00
eb07a2ce7c Make gated subreddits accessible by treating them as quarantined (#722)
* Fix gated communities being unviewable by treating them as quarantined

* Show restriction reason in quarantine template

* Add `gated` checks for other requests
2023-02-26 00:40:32 -07:00
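Conceptually, the change above treats a gated community the same way as a quarantined one and surfaces its restriction reason in the existing quarantine template. A rough sketch with purely illustrative names (not Redlib's actual types):

```rust
// Purely illustrative sketch of the approach above; names are not Redlib's.
enum Restriction {
    Quarantined,
    Gated { reason: String },
}

// Gated subreddits go through the same interstitial as quarantined ones.
fn needs_interstitial(restriction: &Option<Restriction>) -> bool {
    restriction.is_some()
}

// The restriction reason is what the quarantine template would display.
fn restriction_reason(restriction: &Restriction) -> &str {
    match restriction {
        Restriction::Quarantined => "This community is quarantined.",
        Restriction::Gated { reason } => reason,
    }
}

fn main() {
    let sub = Some(Restriction::Gated {
        reason: "You must be 18+ to view this community.".to_string(),
    });
    if needs_interstitial(&sub) {
        println!("{}", restriction_reason(sub.as_ref().unwrap()));
    }
}
```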
0b39d4f059 Mark search query as safe on Prev/Next button (#731)
Fixes: #677 again. Complement to #686.
2023-02-26 00:35:05 -07:00
58fa213be8 Reuse hyper client. (#727)
Making a new connection on every request is very slow and wasteful, especially on slower networks.

Fix this by reusing a hyper client, which shares a connection pool.

I'm able to lower /r/popular loading time from 5s to 1.5s on my machine.
2023-02-26 00:33:55 -07:00
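The general pattern is to construct one client up front and share its connection pool across requests. A minimal sketch assuming hyper 0.14, tokio, and once_cell (all listed in the Cargo.toml diff below); it uses a plain HTTP connector for brevity rather than Redlib's actual rustls setup:

```rust
use hyper::{client::HttpConnector, Client};
use once_cell::sync::Lazy;

// Build the client once; every request then reuses its connection pool
// instead of opening a fresh connection each time.
static CLIENT: Lazy<Client<HttpConnector>> = Lazy::new(Client::new);

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    for _ in 0..2 {
        // Both iterations share CLIENT's pooled connection.
        let resp = CLIENT.get("http://example.com".parse()?).await?;
        println!("status: {}", resp.status());
    }
    Ok(())
}
```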
5e03d701e4 Revert "Move unimportant links to footer"
This reverts commit e3df3a9470.
2023-02-19 18:03:55 +00:00
e3df3a9470 Move unimportant links to footer 2023-02-19 18:00:56 +00:00
35504eda14 v0.29.3 -- fix layout bugs on mobile
Addresses the following layout bugs in mobile view:

* improper rendering of award images on posts
* upvote ratio no longer appearing on bottom-right corner of post as
  before
* Reddit warning pop-up background cut off at bottom of page

Fixes #713.
2023-02-14 20:19:19 -07:00
bb5f2674d1 Merge branch 'master' into feature/fixed-navbar 2023-01-16 19:43:54 -08:00
6d49858d59 Fixed navbar: Add preference to settings restore link 2022-06-18 23:05:36 +01:00
6c202a59b0 Make the fixed navbar optional
Adds another on/off preference (default: on, which keeps the same
behaviour) for the fixed navbar.
When off, the navbar will not remain at the top of the
page when scrolling.
This is useful for small displays such as phones, where
otherwise the navbar takes up a sizeable portion of
the viewport.
2022-06-18 22:53:30 +01:00
c8805f1078 Add opensearch support 2021-04-06 01:23:14 +02:00
61 changed files with 3056 additions and 1240 deletions

View File

@ -6,7 +6,7 @@
},
"portsAttributes": {
"8080": {
"label": "libreddit",
"label": "redlib",
"onAutoForward": "notify"
}
},

View File

@ -1,6 +1,6 @@
---
name: ✨ Feature parity
about: Suggest implementing a feature into Libreddit that is found in Reddit.com
about: Suggest implementing a feature into Redlib that is found in Reddit.com
title: '✨ Feature parity: '
labels: feature parity
assignees: ''
@ -12,7 +12,7 @@ assignees: ''
A clear and concise description of what the feature is.
-->
## Describe how this could be implemented into Libreddit
## Describe how this could be implemented into Redlib
<!--
A clear and concise description of what you want to happen.
-->

View File

@ -1,6 +1,6 @@
---
name: 💡 Feature request
about: Suggest a feature for Libreddit that is not found in Reddit
about: Suggest a feature for Redlib that is not found in Reddit
title: '💡 Feature request: '
labels: enhancement
assignees: ''

View File

@ -1,38 +0,0 @@
name: Docker ARM Build
on:
push:
paths-ignore:
- "**.md"
branches:
- master
jobs:
build-docker:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
with:
platforms: all
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v1
with:
version: latest
- name: Login to DockerHub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Build and push
uses: docker/build-push-action@v2
with:
context: .
file: ./Dockerfile.arm
platforms: linux/arm64
push: true
tags: libreddit/libreddit:arm
cache-from: type=gha
cache-to: type=gha,mode=max

View File

@ -1,41 +0,0 @@
name: Docker ARM V7 Build
on:
push:
paths-ignore:
- "**.md"
branches:
- master
jobs:
build-docker:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set up QEMU
id: qemu
uses: docker/setup-qemu-action@v1
with:
platforms: all
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v1
with:
version: latest
- name: Login to DockerHub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Build and push
id: build_push
uses: docker/build-push-action@v2
with:
context: .
file: ./Dockerfile.armv7
platforms: linux/arm/v7
push: true
tags: libreddit/libreddit:armv7
cache-from: type=gha
cache-to: type=gha,mode=max

View File

@ -1,44 +0,0 @@
name: Docker amd64 Build
on:
push:
paths-ignore:
- "**.md"
branches:
- master
jobs:
build-docker:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
with:
platforms: all
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v1
with:
version: latest
- name: Login to DockerHub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Docker Hub Description
uses: peter-evans/dockerhub-description@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
repository: libreddit/libreddit
- name: Build and push
uses: docker/build-push-action@v2
with:
context: .
file: ./Dockerfile
platforms: linux/amd64
push: true
tags: libreddit/libreddit:latest
cache-from: type=gha
cache-to: type=gha,mode=max

.github/workflows/main-docker.yml (new file, 59 lines)
View File

@ -0,0 +1,59 @@
name: Docker Build
on:
push:
paths-ignore:
- "**.md"
branches:
- 'main'
jobs:
build-docker:
runs-on: ubuntu-latest
strategy:
matrix:
config:
- { platform: 'linux/amd64', tag: 'latest', dockerfile: 'Dockerfile' }
- { platform: 'linux/arm64', tag: 'latest-arm', dockerfile: 'Dockerfile.arm' }
- { platform: 'linux/arm/v7', tag: 'latest-armv7', dockerfile: 'Dockerfile.armv7' }
steps:
- name: Checkout sources
uses: actions/checkout@v3
- name: Set up QEMU
uses: docker/setup-qemu-action@v2
with:
platforms: all
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v2
with:
version: latest
- name: Login to Quay.io
uses: docker/login-action@v3
with:
registry: quay.io
username: ${{ secrets.QUAY_USERNAME }}
password: ${{ secrets.QUAY_ROBOT_TOKEN }}
- name: push README to Quay.io
uses: christian-korneck/update-container-description-action@v1
env:
DOCKER_APIKEY: ${{ secrets.APIKEY__QUAY_IO }}
with:
destination_container_repo: quay.io/redlib/redlib
provider: quay
readme_file: 'README.md'
- name: Build and push
uses: docker/build-push-action@v5
with:
context: .
file: ./${{ matrix.config.dockerfile }}
platforms: ${{ matrix.config.platform }}
push: true
tags: quay.io/redlib/redlib:${{ matrix.config.tag }}
cache-from: type=gha
cache-to: type=gha,mode=max

.github/workflows/main-rust.yml (new file, 78 lines)
View File

@ -0,0 +1,78 @@
name: Rust Build & Publish
on:
push:
paths-ignore:
- "**.md"
branches:
- 'main'
release:
types: [published]
env:
CARGO_TERM_COLOR: always
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout sources
uses: actions/checkout@v3
- name: Cache Packages
uses: Swatinem/rust-cache@v2
- name: Install stable toolchain
uses: dtolnay/rust-toolchain@stable
with:
toolchain: stable
# Building actions
- name: Build
run: RUSTFLAGS='-C target-feature=+crt-static' cargo build --release --target x86_64-unknown-linux-gnu
- name: Versions
id: version
run: echo "VERSION=$(cargo metadata --format-version 1 --no-deps | jq .packages[0].version -r | sed 's/^/v/')" >> "$GITHUB_OUTPUT"
# Publishing actions
- name: Publish to crates.io
if: github.event_name == 'release'
run: cargo publish --no-verify --token ${{ secrets.CARGO_REGISTRY_TOKEN }}
- name: Calculate SHA512 checksum
run: sha512sum target/x86_64-unknown-linux-gnu/release/redlib > redlib.sha512
- name: Calculate SHA256 checksum
run: sha256sum target/x86_64-unknown-linux-gnu/release/redlib > redlib.sha256
- uses: actions/upload-artifact@v3
name: Upload a Build Artifact
with:
name: redlib
path: |
target/x86_64-unknown-linux-gnu/release/redlib
redlib.sha512
redlib.sha256
- name: Release
uses: softprops/action-gh-release@v1
if: github.base_ref != 'main' && github.event_name == 'release'
with:
tag_name: ${{ steps.version.outputs.VERSION }}
name: ${{ steps.version.outputs.VERSION }} - ${{ github.event.head_commit.message }}
draft: true
files: |
target/x86_64-unknown-linux-gnu/release/redlib
redlib.sha512
redlib.sha256
body: |
- ${{ github.event.head_commit.message }} ${{ github.sha }}
generate_release_notes: true
env:
GITHUB_TOKEN: ${{ secrets.RELEASE_TOKEN }}

.github/workflows/pull-request.yml (new file, 67 lines)
View File

@ -0,0 +1,67 @@
name: Pull Request
env:
CARGO_TERM_COLOR: always
NEXTEST_RETRIES: 10
on:
push:
branches:
- 'main'
pull_request:
branches:
- 'main'
jobs:
test:
name: cargo test
runs-on: ubuntu-latest
steps:
- name: Checkout sources
uses: actions/checkout@v3
- name: Install stable toolchain
uses: dtolnay/rust-toolchain@stable
with:
toolchain: stable
- name: Install cargo-nextest
uses: taiki-e/install-action@nextest
- name: Run cargo nextest
run: cargo nextest run
format:
name: cargo fmt --all -- --check
runs-on: ubuntu-latest
steps:
- name: Checkout sources
uses: actions/checkout@v3
- name: Install stable toolchain with rustfmt component
uses: dtolnay/rust-toolchain@stable
with:
toolchain: stable
components: rustfmt
- name: Run cargo fmt
run: cargo fmt --all -- --check
clippy:
name: cargo clippy -- -D warnings
runs-on: ubuntu-latest
steps:
- name: Checkout sources
uses: actions/checkout@v3
- name: Install stable toolchain with clippy component
uses: dtolnay/rust-toolchain@stable
with:
toolchain: stable
components: clippy
- name: Run cargo clippy
run: cargo clippy -- -D warnings

View File

@ -1,22 +0,0 @@
name: Tests
on:
push:
branches: [ "master" ]
pull_request:
branches: [ "master" ]
env:
CARGO_TERM_COLOR: always
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Build
run: cargo build --verbose
- name: Run tests
run: cargo test --verbose

View File

@ -1,59 +0,0 @@
name: Rust
on:
push:
paths-ignore:
- "**.md"
branches:
- master
env:
CARGO_TERM_COLOR: always
jobs:
build:
runs-on: ubuntu-18.04
steps:
- uses: actions/checkout@v2
- name: Cache Packages
uses: Swatinem/rust-cache@v1.0.1
- name: Build
run: cargo build --release
- name: Publish to crates.io
continue-on-error: true
run: cargo publish --no-verify --token ${{ secrets.CARGO_REGISTRY_TOKEN }}
- uses: actions/upload-artifact@v2.2.1
name: Upload a Build Artifact
with:
name: libreddit
path: target/release/libreddit
- name: Versions
id: version
run: |
echo "::set-output name=version::$(cargo metadata --format-version 1 --no-deps | jq .packages[0].version -r | sed 's/^/v/')"
echo "::set-output name=tag::$(git describe --tags)"
- name: Calculate SHA512 checksum
run: sha512sum target/release/libreddit > libreddit.sha512
- name: Release
uses: softprops/action-gh-release@v1
if: github.base_ref != 'master'
with:
tag_name: ${{ steps.version.outputs.version }}
name: ${{ steps.version.outputs.version }} - ${{ github.event.head_commit.message }}
draft: true
files: |
target/release/libreddit
libreddit.sha512
body: |
- ${{ github.event.head_commit.message }} ${{ github.sha }}
generate_release_notes: true
env:
GITHUB_TOKEN: ${{ secrets.RELEASE_TOKEN }}

.gitignore (5 lines changed)
View File

@ -1 +1,4 @@
/target
/target
.env
# Idea Files
.idea/

View File

@ -1,2 +1,2 @@
run = "while :; do set -ex; curl -o./libreddit -fsSL -- https://github.com/libreddit/libreddit/releases/latest/download/libreddit ; chmod +x libreddit; set +e; ./libreddit -H 63115200; sleep 1; done"
run = "while :; do set -ex; nix-env -iA nixpkgs.unzip; curl -o./redlib.zip -fsSL -- https://nightly.link/redlib-org/redlib/workflows/main-rust/main/redlib.zip; unzip -n redlib.zip; mv target/x86_64-unknown-linux-gnu/release/redlib .; chmod +x redlib; set +e; ./redlib -H 63115200; sleep 1; done"
language = "bash"

View File

@ -21,6 +21,7 @@ Daniel Valentine <daniel@vielle.ws>
dbrennand <52419383+dbrennand@users.noreply.github.com>
dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Diego Magdaleno <38844659+DiegoMagdaleno@users.noreply.github.com>
domve <domve@posteo.net>
Dyras <jevwmguf@duck.com>
Edward <101938856+EdwardLangdon@users.noreply.github.com>
elliot <75391956+ellieeet123@users.noreply.github.com>
@ -58,9 +59,11 @@ Nicholas Christopher <nchristopher@tuta.io>
Nick Lowery <ClockVapor@users.noreply.github.com>
Nico <github@dr460nf1r3.org>
NKIPSC <15067635+NKIPSC@users.noreply.github.com>
o69mar <119129086+o69mar@users.noreply.github.com>
obeho <71698631+obeho@users.noreply.github.com>
obscurity <z@x4.pm>
Om G <34579088+OxyMagnesium@users.noreply.github.com>
pin <90570748+0323pin@users.noreply.github.com>
potatoesAreGod <118043038+potatoesAreGod@users.noreply.github.com>
RiversideRocks <59586759+RiversideRocks@users.noreply.github.com>
robin <8597693+robrobinbin@users.noreply.github.com>
@ -88,5 +91,6 @@ Tsvetomir Bonev <invakid404@riseup.net>
Vladislav Nepogodin <nepogodin.vlad@gmail.com>
Walkx <walkxnl@gmail.com>
Wichai <1482605+Chengings@users.noreply.github.com>
wsy2220 <wsy@dogben.com>
xatier <xatierlike@gmail.com>
Zach <72994911+zachjmurphy@users.noreply.github.com>

Cargo.lock (generated, 1226 lines changed)

File diff suppressed because it is too large.

View File

@ -1,38 +1,50 @@
[package]
name = "libreddit"
name = "redlib"
description = " Alternative private front-end to Reddit"
license = "AGPL-3.0"
repository = "https://github.com/spikecodes/libreddit"
version = "0.29.2"
authors = ["spikecodes <19519553+spikecodes@users.noreply.github.com>"]
repository = "https://github.com/redlib-org/redlib"
version = "0.31.0"
authors = [
"Matthew Esposito <matt+cargo@matthew.science>",
"spikecodes <19519553+spikecodes@users.noreply.github.com>",
]
edition = "2021"
[dependencies]
askama = { version = "0.11.1", default-features = false }
cached = "0.42.0"
clap = { version = "4.1.1", default-features = false, features = ["std", "env"] }
regex = "1.7.1"
serde = { version = "1.0.152", features = ["derive"] }
cookie = "0.16.2"
futures-lite = "1.12.0"
hyper = { version = "0.14.23", features = ["full"] }
hyper-rustls = "0.23.2"
percent-encoding = "2.2.0"
cached = { version = "0.46.1", features = ["async"] }
clap = { version = "4.4.11", default-features = false, features = [
"std",
"env",
] }
regex = "1.10.2"
serde = { version = "1.0.193", features = ["derive"] }
cookie = "0.18.0"
futures-lite = "1.13.0"
hyper = { version = "0.14.28", features = ["full"] }
hyper-rustls = "0.24.2"
percent-encoding = "2.3.1"
route-recognizer = "0.3.1"
serde_json = "1.0.91"
tokio = { version = "1.24.2", features = ["full"] }
time = { version = "0.3.17", features = ["local-offset"] }
url = "2.3.1"
rust-embed = { version = "6.4.2", features = ["include-exclude"] }
libflate = "1.2.0"
brotli = { version = "3.3.4", features = ["std"] }
toml = "0.5.10"
once_cell = "1.17.0"
serde_yaml = "0.9.16"
build_html = "2.2.0"
serde_json = "1.0.108"
tokio = { version = "1.35.1", features = ["full"] }
time = { version = "0.3.31", features = ["local-offset"] }
url = "2.5.0"
rust-embed = { version = "8.1.0", features = ["include-exclude"] }
libflate = "2.0.0"
brotli = { version = "3.4.0", features = ["std"] }
toml = "0.8.8"
once_cell = "1.19.0"
serde_yaml = "0.9.29"
build_html = "2.4.0"
uuid = { version = "1.6.1", features = ["v4"] }
base64 = "0.21.5"
fastrand = "2.0.1"
log = "0.4.20"
pretty_env_logger = "0.5.0"
dotenvy = "0.15.7"
[dev-dependencies]
lipsum = "0.8.2"
lipsum = "0.9.0"
sealed_test = "1.0.0"
[profile.release]

View File

@ -5,7 +5,7 @@ FROM rust:alpine AS builder
RUN apk add --no-cache musl-dev
WORKDIR /libreddit
WORKDIR /redlib
COPY . .
@ -21,16 +21,16 @@ COPY --from=builder /usr/share/ca-certificates /usr/share/ca-certificates
COPY --from=builder /etc/ssl/certs /etc/ssl/certs
# Copy our build
COPY --from=builder /libreddit/target/x86_64-unknown-linux-musl/release/libreddit /usr/local/bin/libreddit
COPY --from=builder /redlib/target/x86_64-unknown-linux-musl/release/redlib /usr/local/bin/redlib
# Use an unprivileged user.
RUN adduser --home /nonexistent --no-create-home --disabled-password libreddit
USER libreddit
RUN adduser --home /nonexistent --no-create-home --disabled-password redlib
USER redlib
# Tell Docker to expose port 8080
EXPOSE 8080
# Run a healthcheck every minute to make sure Libreddit is functional
# Run a healthcheck every minute to make sure redlib is functional
HEALTHCHECK --interval=1m --timeout=3s CMD wget --spider --q http://localhost:8080/settings || exit 1
CMD ["libreddit"]
CMD ["redlib"]

View File

@ -5,7 +5,11 @@ FROM rust:alpine AS builder
RUN apk add --no-cache g++ git
WORKDIR /usr/src/libreddit
WORKDIR /usr/src/redlib
# cache dependencies in their own layer
COPY Cargo.lock Cargo.toml .
RUN mkdir src && echo "fn main() {}" > src/main.rs && cargo install --config net.git-fetch-with-cli=true --path . && rm -rf ./src
COPY . .
@ -26,16 +30,16 @@ COPY --from=builder /usr/share/ca-certificates /usr/share/ca-certificates
COPY --from=builder /etc/ssl/certs /etc/ssl/certs
# Copy our build
COPY --from=builder /usr/local/cargo/bin/libreddit /usr/local/bin/libreddit
COPY --from=builder /usr/local/cargo/bin/redlib /usr/local/bin/redlib
# Use an unprivileged user.
RUN adduser --home /nonexistent --no-create-home --disabled-password libreddit
USER libreddit
RUN adduser --home /nonexistent --no-create-home --disabled-password redlib
USER redlib
# Tell Docker to expose port 8080
EXPOSE 8080
# Run a healthcheck every minute to make sure Libreddit is functional
# Run a healthcheck every minute to make sure redlib is functional
HEALTHCHECK --interval=1m --timeout=3s CMD wget --spider --q http://localhost:8080/settings || exit 1
CMD ["libreddit"]
CMD ["redlib"]

View File

@ -12,7 +12,7 @@ RUN apt-get update && apt-get -y install gcc-arm-linux-gnueabihf \
RUN rustup target add armv7-unknown-linux-musleabihf
WORKDIR /libreddit
WORKDIR /redlib
COPY . .
@ -28,16 +28,16 @@ COPY --from=builder /usr/share/ca-certificates /usr/share/ca-certificates
COPY --from=builder /etc/ssl/certs /etc/ssl/certs
# Copy our build
COPY --from=builder /libreddit/target/armv7-unknown-linux-musleabihf/release/libreddit /usr/local/bin/libreddit
COPY --from=builder /redlib/target/armv7-unknown-linux-musleabihf/release/redlib /usr/local/bin/redlib
# Use an unprivileged user.
RUN adduser --home /nonexistent --no-create-home --disabled-password libreddit
USER libreddit
RUN adduser --home /nonexistent --no-create-home --disabled-password redlib
USER redlib
# Tell Docker to expose port 8080
EXPOSE 8080
# Run a healthcheck every minute to make sure Libreddit is functional
# Run a healthcheck every minute to make sure redlib is functional
HEALTHCHECK --interval=1m --timeout=3s CMD wget --spider --q http://localhost:8080/settings || exit 1
CMD ["libreddit"]
CMD ["redlib"]

README.md (182 lines changed)
View File

@ -1,12 +1,15 @@
# Libreddit
# Redlib
> An alternative private front-end to Reddit
# ⚠️ Why do I get TOO MANY REQUESTS errors? ⚠️
#### As of July 12th, 2023, Redlib is currently not operational as Reddit's API changes, that were designed to kill third-party apps and content scrapers who don't pay [large fees](https://www.theverge.com/2023/5/31/23743993/reddit-apollo-client-api-cost), went into effect. [Read the full announcement here.](https://github.com/libreddit/libreddit/issues/840)
![screenshot](https://i.ibb.co/QYbqTQt/libreddit-rust.png)
---
**10 second pitch:** Libreddit is a portmanteau of "libre" (meaning freedom) and "Reddit". It is a private front-end like [Invidious](https://github.com/iv-org/invidious) but for Reddit. Browse the coldest takes of [r/unpopularopinion](https://libreddit.spike.codes/r/unpopularopinion) without being [tracked](#reddit).
**10-second pitch:** Redlib is a private front-end like [Invidious](https://github.com/iv-org/invidious) but for Reddit. Browse the coldest takes of [r/unpopularopinion](https://libreddit.spike.codes/r/unpopularopinion) without being [tracked](#reddit).
- 🚀 Fast: written in Rust for blazing-fast speeds and memory safety
- ☁️ Light: no JavaScript, no ads, no tracking, no bloat
@ -15,7 +18,7 @@
---
I appreciate any donations! Your support allows me to continue developing Libreddit.
I appreciate any donations! Your support allows me to continue developing Redlib.
<a href="https://www.buymeacoffee.com/spikecodes" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 40px" ></a>
<a href="https://liberapay.com/spike/donate"><img alt="Donate using Liberapay" src="https://liberapay.com/assets/widgets/donate.svg" style="height: 40px"></a>
@ -29,51 +32,51 @@ I appreciate any donations! Your support allows me to continue developing Libred
# Instances
🔗 **Want to automatically redirect Reddit links to Libreddit? Use [LibRedirect](https://github.com/libredirect/libredirect) or [Privacy Redirect](https://github.com/SimonBrazell/privacy-redirect)!**
🔗 **Want to automatically redirect Reddit links to Redlib? Use [LibRedirect](https://github.com/libredirect/libredirect) or [Privacy Redirect](https://github.com/SimonBrazell/privacy-redirect)!**
[Follow this link](https://github.com/libreddit/libreddit-instances/blob/master/instances.md) for an up-to-date table of instances in markdown format. This list is also available as [a machine-readable JSON](https://github.com/libreddit/libreddit-instances/blob/master/instances.json).
[Follow this link](https://github.com/redlib-org/redlib-instances/blob/main/instances.md) for an up-to-date table of instances in Markdown format. This list is also available as [a machine-readable JSON](https://github.com/redlib-org/redlib-instances/blob/main/instances.json).
Both files are part of the [libreddit-instances](https://github.com/libreddit/libreddit-instances) repository. To contribute your [self-hosted instance](#deployment) to the list, see the [libreddit-instances README](https://github.com/libreddit/libreddit-instances/blob/master/README.md).
Both files are part of the [libreddit-instances](https://github.com/redlib-org/redlib-instances) repository. To contribute your [self-hosted instance](#deployment) to the list, see the [libreddit-instances README](https://github.com/redlib-org/redlib-instances/blob/main/README.md).
---
# About
Find Libreddit on 💬 [Matrix](https://matrix.to/#/#libreddit:kde.org), 🐋 [Docker](https://hub.docker.com/r/libreddit/libreddit), :octocat: [GitHub](https://github.com/libreddit/libreddit), and 🦊 [GitLab](https://gitlab.com/libreddit/libreddit).
Find Redlib on 💬 [Matrix](https://matrix.to/#/#redlib:matrix.org), 🐋 [Quay.io](https://quay.io/repository/redlib/redlib), :octocat: [GitHub](https://github.com/redlib-org/redlib), and 🦊 [GitLab](https://gitlab.com/redlib/redlib).
## Built with
- [Rust](https://www.rust-lang.org/) - Programming language
- [Hyper](https://github.com/hyperium/hyper) - HTTP server and client
- [Askama](https://github.com/djc/askama) - Templating engine
- [Rustls](https://github.com/ctz/rustls) - TLS library
- [Rustls](https://github.com/rustls/rustls) - TLS library
## Info
Libreddit hopes to provide an easier way to browse Reddit, without the ads, trackers, and bloat. Libreddit was inspired by other alternative front-ends to popular services such as [Invidious](https://github.com/iv-org/invidious) for YouTube, [Nitter](https://github.com/zedeus/nitter) for Twitter, and [Bibliogram](https://sr.ht/~cadence/bibliogram/) for Instagram.
Redlib hopes to provide an easier way to browse Reddit, without the ads, trackers, and bloat. Redlib was inspired by other alternative front-ends to popular services such as [Invidious](https://github.com/iv-org/invidious) for YouTube, [Nitter](https://github.com/zedeus/nitter) for Twitter, and [Bibliogram](https://sr.ht/~cadence/bibliogram/) for Instagram.
Libreddit currently implements most of Reddit's (signed-out) functionalities but still lacks [a few features](https://github.com/libreddit/libreddit/issues).
Redlib currently implements most of Reddit's (signed-out) functionalities but still lacks [a few features](https://github.com/libreddit/libreddit/issues).
## How does it compare to Teddit?
Teddit is another awesome open source project designed to provide an alternative frontend to Reddit. There is no connection between the two and you're welcome to use whichever one you favor. Competition fosters innovation and Teddit's release has motivated me to build Libreddit into an even more polished product.
Teddit is another awesome open source project designed to provide an alternative frontend to Reddit. There is no connection between the two, and you're welcome to use whichever one you favor. Competition fosters innovation and Teddit's release has motivated me to build Redlib into an even more polished product.
If you are looking to compare, the biggest differences I have noticed are:
- Libreddit is themed around Reddit's redesign whereas Teddit appears to stick much closer to Reddit's old design. This may suit some users better as design is always subjective.
- Libreddit is written in [Rust](https://www.rust-lang.org) for speed and memory safety. It uses [Hyper](https://hyper.rs), a speedy and lightweight HTTP server/client implementation.
- Redlib is themed around Reddit's redesign whereas Teddit appears to stick much closer to Reddit's old design. This may suit some users better as design is always subjective.
- Redlib is written in [Rust](https://www.rust-lang.org) for speed and memory safety. It uses [Hyper](https://hyper.rs), a speedy and lightweight HTTP server/client implementation.
---
# Comparison
This section outlines how Libreddit compares to Reddit.
This section outlines how Redlib compares to Reddit.
## Speed
Lasted tested Nov 11, 2022.
Results from Google PageSpeed Insights ([Libreddit Report](https://pagespeed.web.dev/report?url=https%3A%2F%2Flibreddit.spike.codes%2F), [Reddit Report](https://pagespeed.web.dev/report?url=https://www.reddit.com)).
Results from Google PageSpeed Insights ([Redlib Report](https://pagespeed.web.dev/report?url=https%3A%2F%2Flibreddit.spike.codes%2F), [Reddit Report](https://pagespeed.web.dev/report?url=https://www.reddit.com)).
| | Libreddit | Reddit |
| | Redlib | Reddit |
|------------------------|-------------|-----------|
| Requests | 60 | 83 |
| Speed Index | 2.0s | 10.4s |
@ -110,17 +113,25 @@ Results from Google PageSpeed Insights ([Libreddit Report](https://pagespeed.web
- Third-Party Cookies
- Third-Party Site
### Libreddit
### Redlib
For transparency, I hope to describe all the ways Libreddit handles user privacy.
For transparency, I hope to describe all the ways Redlib handles user privacy.
**Logging:** In production (when running the binary, hosting with docker, or using the official instances), Libreddit logs nothing. When debugging (running from source without `--release`), Libreddit logs post IDs fetched to aid with troubleshooting.
#### Server
**DNS:** Both official domains (`libredd.it` and `libreddit.spike.codes`) use Cloudflare as the DNS resolver. Though, the sites are not proxied through Cloudflare meaning Cloudflare doesn't have access to user traffic.
* **Logging:** In production (when running the binary, hosting with docker, or using the official instances), Redlib logs nothing. When debugging (running from source without `--release`), Redlib logs post IDs fetched to aid with troubleshooting.
**Cookies:** Libreddit uses optional cookies to store any configured settings in [the settings menu](https://libreddit.spike.codes/settings). These are not cross-site cookies and the cookies hold no personal data.
* **Cookies:** Redlib uses optional cookies to store any configured settings in [the settings menu](https://libreddit.spike.codes/settings). These are not cross-site cookies and the cookies hold no personal data.
**Hosting:** The official instances are hosted on [Replit](https://replit.com/) which monitors usage to prevent abuse. I can understand if this invalidates certain users' threat models and therefore, self-hosting, using unofficial instances, and browsing through Tor are welcomed.
#### Official instance (libreddit.spike.codes)
The official instance is hosted at https://libreddit.spike.codes.
* **Server:** The official instance runs a production binary, and thus logs nothing.
* **DNS:** The domain for the official instance uses Cloudflare as the DNS resolver. However, this site is not proxied through Cloudflare, and thus Cloudflare doesn't have access to user traffic.
* **Hosting:** The official instance is hosted on [Replit](https://replit.com/), which monitors usage to prevent abuse. I can understand if this invalidates certain users' threat models, and therefore, self-hosting, using unofficial instances, and browsing through Tor are welcomed.
---
@ -136,103 +147,122 @@ cargo install libreddit
## 2) Docker
Deploy the [Docker image](https://hub.docker.com/r/libreddit/libreddit) of Libreddit:
Deploy the [Docker image](https://quay.io/repository/redlib/redlib) of Redlib:
```
docker pull libreddit/libreddit
docker run -d --name libreddit -p 8080:8080 libreddit/libreddit
docker pull quay.io/redlib/redlib
docker run -d --name redlib -p 8080:8080 quay.io/redlib/redlib
```
Deploy using a different port (in this case, port 80):
```
docker pull libreddit/libreddit
docker run -d --name libreddit -p 80:8080 libreddit/libreddit
docker pull quay.io/redlib/redlib
docker run -d --name redlib -p 80:8080 quay.io/redlib/redlib
```
To deploy on `arm64` platforms, simply replace `libreddit/libreddit` in the commands above with `libreddit/libreddit:arm`.
To deploy on `arm64` platforms, simply replace `quay.io/redlib/redlib` in the commands above with `quay.io/redlib/redlib:latest-arm`.
To deploy on `armv7` platforms, simply replace `libreddit/libreddit` in the commands above with `libreddit/libreddit:armv7`.
To deploy on `armv7` platforms, simply replace `quay.io/redlib/redlib` in the commands above with `quay.io/redlib/redlib:latest-armv7`.
## 3) AUR
For ArchLinux users, Libreddit is available from the AUR as [`libreddit-git`](https://aur.archlinux.org/packages/libreddit-git).
For ArchLinux users, Redlib is available from the AUR as [`libreddit-git`](https://aur.archlinux.org/packages/libreddit-git).
```
yay -S libreddit-git
```
## 4) NetBSD/pkgsrc
## 4) GitHub Releases
For NetBSD users, Redlib is available from the official repositories.
If you're on Linux and none of these methods work for you, you can grab a Linux binary from [the newest release](https://github.com/libreddit/libreddit/releases/latest).
```
pkgin install libreddit
```
## 5) Replit/Heroku/Glitch
Or, if you prefer to build from source
```
cd /usr/pkgsrc/libreddit
make install
```
## 5) GitHub Releases
If you're on Linux and none of these methods work for you, you can grab a Linux binary from [the newest release](https://github.com/redlib-org/redlib/releases/latest).
## 6) Replit/Heroku/Glitch
> **Warning**
> These are free hosting options but they are *not* private and will monitor server usage to prevent abuse. If you need a free and easy setup, this method may work best for you.
> These are free hosting options, but they are *not* private and will monitor server usage to prevent abuse. If you need a free and easy setup, this method may work best for you.
<a href="https://repl.it/github/libreddit/libreddit"><img src="https://repl.it/badge/github/libreddit/libreddit" alt="Run on Repl.it" height="32" /></a>
[![Deploy](https://www.herokucdn.com/deploy/button.svg)](https://heroku.com/deploy?template=https://github.com/libreddit/libreddit)
<a href="https://repl.it/github/redlib-org/redlib"><img src="https://repl.it/badge/github/redlib-org/redlib" alt="Run on Repl.it" height="32" /></a>
[![Deploy](https://www.herokucdn.com/deploy/button.svg)](https://heroku.com/deploy?template=https://github.com/redlib-org/redlib)
[![Remix on Glitch](https://cdn.glitch.com/2703baf2-b643-4da7-ab91-7ee2a2d00b5b%2Fremix-button-v2.svg)](https://glitch.com/edit/#!/remix/libreddit)
---
# Deployment
Once installed, deploy Libreddit to `0.0.0.0:8080` by running:
Once installed, deploy Redlib to `0.0.0.0:8080` by running:
```
libreddit
redlib
```
## Instance settings
Assign a default value for each instance-specific setting by passing environment variables to Libreddit in the format `LIBREDDIT_{X}`. Replace `{X}` with the setting name (see list below) in capital letters.
Assign a default value for each instance-specific setting by passing environment variables to Redlib in the format `REDLIB_{X}`. Replace `{X}` with the setting name (see list below) in capital letters.
|Name|Possible values|Default value|Description|
|-|-|-|-|
| `SFW_ONLY` | `["on", "off"]` | `off` | Enables SFW-only mode for the instance, i.e. all NSFW content is filtered. |
| `BANNER` | String | (empty) | Allows the server to set a banner to be displayed. Currently this is displayed on the instance info page. |
| Name | Possible values | Default value | Description |
|---------------------------|-----------------|------------------|-----------------------------------------------------------------------------------------------------------|
| `SFW_ONLY` | `["on", "off"]` | `off` | Enables SFW-only mode for the instance, i.e. all NSFW content is filtered. |
| `BANNER` | String | (empty) | Allows the server to set a banner to be displayed. Currently this is displayed on the instance info page. |
| `ROBOTS_DISABLE_INDEXING` | `["on", "off"]` | `off` | Disables indexing of the instance by search engines. |
| `PUSHSHIFT_FRONTEND` | String | `www.unddit.com` | Allows the server to set the Pushshift frontend to be used with "removed" links. |
## Default User Settings
Assign a default value for each user-modifiable setting by passing environment variables to Libreddit in the format `LIBREDDIT_DEFAULT_{Y}`. Replace `{Y}` with the setting name (see list below) in capital letters.
Assign a default value for each user-modifiable setting by passing environment variables to Redlib in the format `REDLIB_DEFAULT_{Y}`. Replace `{Y}` with the setting name (see list below) in capital letters.
| Name | Possible values | Default value |
|-------------------------|-----------------------------------------------------------------------------------------------------|---------------|
| `THEME` | `["system", "light", "dark", "black", "dracula", "nord", "laserwave", "violet", "gold", "rosebox", "gruvboxdark", "gruvboxlight"]` | `system` |
| `FRONT_PAGE` | `["default", "popular", "all"]` | `default` |
| `LAYOUT` | `["card", "clean", "compact"]` | `card` |
| `WIDE` | `["on", "off"]` | `off` |
| `POST_SORT` | `["hot", "new", "top", "rising", "controversial"]` | `hot` |
| `COMMENT_SORT` | `["confidence", "top", "new", "controversial", "old"]` | `confidence` |
| `SHOW_NSFW` | `["on", "off"]` | `off` |
| `BLUR_NSFW` | `["on", "off"]` | `off` |
| `USE_HLS` | `["on", "off"]` | `off` |
| `HIDE_HLS_NOTIFICATION` | `["on", "off"]` | `off` |
| `AUTOPLAY_VIDEOS` | `["on", "off"]` | `off` |
| `HIDE_AWARDS` | `["on", "off"]` | `off`
| `DISABLE_VISIT_REDDIT_CONFIRMATION` | `["on", "off"]` | `off` |
| Name | Possible values | Default value |
|-------------------------------------|------------------------------------------------------------------------------------------------------------------------------------|---------------|
| `THEME` | `["system", "light", "dark", "black", "dracula", "nord", "laserwave", "violet", "gold", "rosebox", "gruvboxdark", "gruvboxlight"]` | `system` |
| `FRONT_PAGE` | `["default", "popular", "all"]` | `default` |
| `LAYOUT` | `["card", "clean", "compact"]` | `card` |
| `WIDE` | `["on", "off"]` | `off` |
| `POST_SORT` | `["hot", "new", "top", "rising", "controversial"]` | `hot` |
| `COMMENT_SORT` | `["confidence", "top", "new", "controversial", "old"]` | `confidence` |
| `SHOW_NSFW` | `["on", "off"]` | `off` |
| `BLUR_NSFW` | `["on", "off"]` | `off` |
| `USE_HLS` | `["on", "off"]` | `off` |
| `HIDE_HLS_NOTIFICATION` | `["on", "off"]` | `off` |
| `AUTOPLAY_VIDEOS` | `["on", "off"]` | `off` |
| `SUBSCRIPTIONS` | `+`-delimited list of subreddits (`sub1+sub2+sub3+...`) | _(none)_ |
| `HIDE_AWARDS` | `["on", "off"]` | `off` |
| `DISABLE_VISIT_REDDIT_CONFIRMATION` | `["on", "off"]` | `off` |
| `HIDE_SCORE` | `["on", "off"]` | `off` |
| `FIXED_NAVBAR` | `["on", "off"]` | `on` |
You can also configure Libreddit with a configuration file. An example `libreddit.toml` can be found below:
You can also configure Redlib with a configuration file. An example `redlib.toml` can be found below:
```toml
LIBREDDIT_DEFAULT_WIDE = "on"
LIBREDDIT_DEFAULT_USE_HLS = "on"
REDLIB_DEFAULT_WIDE = "on"
REDLIB_DEFAULT_USE_HLS = "on"
```
### Examples
```bash
LIBREDDIT_DEFAULT_SHOW_NSFW=on libreddit
REDLIB_DEFAULT_SHOW_NSFW=on redlib
```
```bash
LIBREDDIT_DEFAULT_WIDE=on LIBREDDIT_DEFAULT_THEME=dark libreddit -r
REDLIB_DEFAULT_WIDE=on REDLIB_DEFAULT_THEME=dark redlib -r
```
## Proxying using NGINX
> **Note**
> If you're [proxying Libreddit through an NGINX Reverse Proxy](https://github.com/libreddit/libreddit/issues/122#issuecomment-782226853), add
> If you're [proxying Redlib through an NGINX Reverse Proxy](https://github.com/libreddit/libreddit/issues/122#issuecomment-782226853), add
> ```nginx
> proxy_http_version 1.1;
> ```
@ -240,27 +270,35 @@ LIBREDDIT_DEFAULT_WIDE=on LIBREDDIT_DEFAULT_THEME=dark libreddit -r
## systemd
You can use the systemd service available in `contrib/libreddit.service`
(install it on `/etc/systemd/system/libreddit.service`).
You can use the systemd service available in `contrib/redlib.service`
(install it on `/etc/systemd/system/redlib.service`).
That service can be optionally configured in terms of environment variables by
creating a file in `/etc/libreddit.conf`. Use the `contrib/libreddit.conf` as a
template. You can also add the `LIBREDDIT_DEFAULT__{X}` settings explained
creating a file in `/etc/redlib.conf`. Use the `contrib/redlib.conf` as a
template. You can also add the `REDLIB_DEFAULT__{X}` settings explained
above.
When "Proxying using NGINX" where the proxy is on the same machine, you should
guarantee nginx waits for this service to start. Edit
`/etc/systemd/system/libreddit.service.d/reverse-proxy.conf`:
`/etc/systemd/system/redlib.service.d/reverse-proxy.conf`:
```conf
[Unit]
Before=nginx.service
```
## launchd
If you are on macOS, you can use the launchd service available in `contrib/redlib.plist`.
Install it with `cp contrib/redlib.plist ~/Library/LaunchAgents/`.
Load and start it with `launchctl load ~/Library/LaunchAgents/redlib.plist`.
## Building
```
git clone https://github.com/libreddit/libreddit
cd libreddit
git clone https://github.com/redlib-org/redlib
cd redlib
cargo run
```

View File

@ -1,5 +1,5 @@
{
"name": "Libreddit",
"name": "Redlib",
"description": "Private front-end for Reddit",
"buildpacks": [
{
@ -11,47 +11,59 @@
],
"stack": "container",
"env": {
"LIBREDDIT_DEFAULT_THEME": {
"REDLIB_DEFAULT_THEME": {
"required": false
},
"LIBREDDIT_DEFAULT_FRONT_PAGE": {
"REDLIB_DEFAULT_FRONT_PAGE": {
"required": false
},
"LIBREDDIT_DEFAULT_LAYOUT": {
"REDLIB_DEFAULT_LAYOUT": {
"required": false
},
"LIBREDDIT_DEFAULT_WIDE": {
"REDLIB_DEFAULT_WIDE": {
"required": false
},
"LIBREDDIT_DEFAULT_COMMENT_SORT": {
"REDLIB_DEFAULT_COMMENT_SORT": {
"required": false
},
"LIBREDDIT_DEFAULT_POST_SORT": {
"REDLIB_DEFAULT_POST_SORT": {
"required": false
},
"LIBREDDIT_DEFAULT_SHOW_NSFW": {
"REDLIB_DEFAULT_SHOW_NSFW": {
"required": false
},
"LIBREDDIT_DEFAULT_BLUR_NSFW": {
"REDLIB_DEFAULT_BLUR_NSFW": {
"required": false
},
"LIBREDDIT_USE_HLS": {
"REDLIB_USE_HLS": {
"required": false
},
"LIBREDDIT_HIDE_HLS_NOTIFICATION": {
"REDLIB_HIDE_HLS_NOTIFICATION": {
"required": false
},
"LIBREDDIT_SFW_ONLY": {
"REDLIB_SFW_ONLY": {
"required": false
},
"LIBREDDIT_DEFAULT_HIDE_AWARDS": {
"REDLIB_DEFAULT_HIDE_AWARDS": {
"required": false
},
"LIBREDDIT_BANNER": {
"REDLIB_DEFAULT_HIDE_SCORE": {
"required": false
},
"LIBREDDIT_DEFAULT_DISABLE_VISIT_REDDIT_CONFIRMATION": {
"REDLIB_BANNER": {
"required": false
}
},
"REDLIB_ROBOTS_DISABLE_INDEXING": {
"required": false
},
"REDLIB_DEFAULT_SUBSCRIPTIONS": {
"required": false
},
"REDLIB_DEFAULT_DISABLE_VISIT_REDDIT_CONFIRMATION": {
"required": false
},
"REDLIB_PUSHSHIFT_FRONTEND": {
"required": false
},
}
}

View File

@ -1,8 +1,13 @@
use std::{
os::unix::process::ExitStatusExt,
process::{Command, ExitStatus, Output},
};
use std::process::{Command, ExitStatus, Output};
#[cfg(not(target_os = "windows"))]
use std::os::unix::process::ExitStatusExt;
#[cfg(target_os = "windows")]
use std::os::windows::process::ExitStatusExt;
fn main() {
println!("cargo:rerun-if-changed=src/");
let output = String::from_utf8(
Command::new("git")
.args(["rev-parse", "HEAD"])

View File

@ -1,2 +0,0 @@
ADDRESS=0.0.0.0
PORT=12345

contrib/redlib.conf (new file, 16 lines)

@ -0,0 +1,16 @@
ADDRESS=0.0.0.0
PORT=12345
#REDLIB_DEFAULT_THEME=default
#REDLIB_DEFAULT_FRONT_PAGE=default
#REDLIB_DEFAULT_LAYOUT=card
#REDLIB_DEFAULT_WIDE=off
#REDLIB_DEFAULT_POST_SORT=hot
#REDLIB_DEFAULT_COMMENT_SORT=confidence
#REDLIB_DEFAULT_SHOW_NSFW=off
#REDLIB_DEFAULT_BLUR_NSFW=off
#REDLIB_DEFAULT_USE_HLS=off
#REDLIB_DEFAULT_HIDE_HLS_NOTIFICATION=off
#REDLIB_DEFAULT_AUTOPLAY_VIDEOS=off
#REDLIB_DEFAULT_SUBSCRIPTIONS=off (sub1+sub2+sub3)
#REDLIB_DEFAULT_HIDE_AWARDS=off
#REDLIB_DEFAULT_DISABLE_VISIT_REDDIT_CONFIRMATION=off

contrib/redlib.plist (new file, 19 lines)

@ -0,0 +1,19 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>Label</key>
<string>redlib</string>
<key>Program</key>
<string>redlib</string>
<key>KeepAlive</key>
<true/>
<key>RunAtLoad</key>
<true/>
</dict>
</plist>

contrib/redlib.service

@ -1,15 +1,15 @@
[Unit]
Description=libreddit daemon
Description=redlib daemon
After=network.service
[Service]
DynamicUser=yes
# Default Values
Environment=ADDRESS=0.0.0.0
Environment=PORT=8080
#Environment=ADDRESS=0.0.0.0
#Environment=PORT=8080
# Optional Override
EnvironmentFile=-/etc/libreddit.conf
ExecStart=/usr/bin/libreddit -a ${ADDRESS} -p ${PORT}
EnvironmentFile=-/etc/redlib.conf
ExecStart=/usr/bin/redlib -a ${ADDRESS} -p ${PORT}
# Hardening
DeviceAllow=

docker-compose.yml

@ -4,10 +4,23 @@ services:
web:
build: .
restart: always
container_name: "libreddit"
container_name: "redlib"
ports:
- 8080:8080
user: nobody
read_only: true
security_opt:
- no-new-privileges:true
cap_drop:
- ALL
networks:
- redlib
security_opt:
- seccomp="seccomp-redlib.json"
healthcheck:
test: ["CMD", "wget", "--spider", "-q", "--tries=1", "http://localhost:8080/settings"]
interval: 5m
timeout: 3s
networks:
redlib:
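For reference, roughly the same hardening can be reproduced without Compose; the sketch below mirrors the options above with plain `docker` commands (the local image tag `redlib` is a placeholder):

```
# Build the image locally (mirrors `build: .` in the compose file), then run it
# with the same hardening options.
docker build -t redlib .
docker run -d \
  --name redlib \
  -p 8080:8080 \
  --user nobody \
  --read-only \
  --cap-drop ALL \
  --security-opt no-new-privileges:true \
  --security-opt seccomp=seccomp-redlib.json \
  redlib
```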


@ -1,7 +1,7 @@
#!/usr/bin/env bash
# This script generates the CREDITS file in the repository root, which
# contains a list of all contributors ot the Libreddit project.
# contains a list of all contributors to the Redlib project.
#
# We use git-log to surface the names and emails of all authors and committers,
# and grep will filter any automated commits due to GitHub.
@ -9,7 +9,7 @@
set -o pipefail
cd "$(dirname "${BASH_SOURCE[0]}")/../" || exit 1
git --no-pager log --pretty='%an <%ae>%n%cn <%ce>' master \
git --no-pager log --pretty='%an <%ae>%n%cn <%ce>' main \
| sort -t'<' -u -k1,1 -k2,2 \
| grep -Fv -- 'GitHub <noreply@github.com>' \
> CREDITS

scripts/update_oauth_resources.sh (new executable file, 112 lines)

@ -0,0 +1,112 @@
#!/bin/bash
# Requirements
# - curl
# - rg
# - jq
# Fetch iOS app versions
ios_version_list=$(curl -s "https://ipaarchive.com/app/usa/1064216828" | rg "(20\d{2}\.\d+.\d+) / (\d+)" --only-matching -r "Version \$1/Build \$2" | sort | uniq)
# Count the number of lines in the version list
ios_app_count=$(echo "$ios_version_list" | wc -l)
echo -e "Fetching \e[34m$ios_app_count iOS app versions...\e[0m"
# Specify the filename as a variable
filename="src/oauth_resources.rs"
# Add comment that it is user generated
echo "// This file was generated by scripts/update_oauth_resources.sh" > "$filename"
echo "// Rerun scripts/update_oauth_resources.sh to update this file" >> "$filename"
echo "// Please do not edit manually" >> "$filename"
echo "// Filled in with real app versions" >> "$filename"
# Open the array in the source file
echo "pub static _IOS_APP_VERSION_LIST: &[&str; $ios_app_count] = &[" >> "$filename"
num=0
# Append the version list to the source file
echo "$ios_version_list" | while IFS= read -r line; do
num=$((num+1))
echo " \"$line\"," >> "$filename"
echo -e "[$num/$ios_app_count] Fetched \e[34m$line\e[0m."
done
# Close the array in the source file
echo "];" >> "$filename"
# Fetch Android app versions
page_1=$(curl -s "https://apkcombo.com/reddit/com.reddit.frontpage/old-versions/" | rg "<a class=\"ver-item\" href=\"(/reddit/com\.reddit\.frontpage/download/phone-20\d{2}\.\d+\.\d+-apk)\" rel=\"nofollow\">" -r "https://apkcombo.com\$1" | sort | uniq)
# Append with pages
page_2=$(curl -s "https://apkcombo.com/reddit/com.reddit.frontpage/old-versions?page=2" | rg "<a class=\"ver-item\" href=\"(/reddit/com\.reddit\.frontpage/download/phone-20\d{2}\.\d+\.\d+-apk)\" rel=\"nofollow\">" -r "https://apkcombo.com\$1" | sort | uniq)
page_3=$(curl -s "https://apkcombo.com/reddit/com.reddit.frontpage/old-versions?page=3" | rg "<a class=\"ver-item\" href=\"(/reddit/com\.reddit\.frontpage/download/phone-20\d{2}\.\d+\.\d+-apk)\" rel=\"nofollow\">" -r "https://apkcombo.com\$1" | sort | uniq)
page_4=$(curl -s "https://apkcombo.com/reddit/com.reddit.frontpage/old-versions?page=4" | rg "<a class=\"ver-item\" href=\"(/reddit/com\.reddit\.frontpage/download/phone-20\d{2}\.\d+\.\d+-apk)\" rel=\"nofollow\">" -r "https://apkcombo.com\$1" | sort | uniq)
page_5=$(curl -s "https://apkcombo.com/reddit/com.reddit.frontpage/old-versions?page=5" | rg "<a class=\"ver-item\" href=\"(/reddit/com\.reddit\.frontpage/download/phone-20\d{2}\.\d+\.\d+-apk)\" rel=\"nofollow\">" -r "https://apkcombo.com\$1" | sort | uniq)
# Concatenate all pages
versions="${page_1}"
versions+=$'\n'
versions+="${page_2}"
versions+=$'\n'
versions+="${page_3}"
versions+=$'\n'
versions+="${page_4}"
versions+=$'\n'
versions+="${page_5}"
# Count the number of lines in the version list
android_count=$(echo "$versions" | wc -l)
echo -e "Fetching \e[32m$android_count Android app versions...\e[0m"
# Append to the source file
echo "pub static ANDROID_APP_VERSION_LIST: &[&str; $android_count] = &[" >> "$filename"
num=0
# For each in versions, curl the page and extract the build number
echo "$versions" | while IFS= read -r line; do
num=$((num+1))
fetch_page=$(curl -s "$line")
build=$(echo "$fetch_page" | rg "<span class=\"vercode\">\((\d+)\)</span>" --only-matching -r "\$1" | head -n1)
version=$(echo "$fetch_page" | rg "<span class=\"vername\">Reddit (20\d{2}\.\d+\.\d+)</span>" --only-matching -r "\$1" | head -n1)
echo " \"Version $version/Build $build\"," >> "$filename"
echo -e "[$num/$android_count] Fetched \e[32mVersion $version/Build $build\e[0m."
done
# Close the array in the source file
echo "];" >> "$filename"
# Retrieve iOS versions
table=$(curl -s "https://en.wikipedia.org/w/api.php?action=parse&page=IOS_17&prop=wikitext&section=31&format=json" | jq ".parse.wikitext.\"*\"" | rg "(17\.[\d\.]*)\\\n\|(\w*)\\\n\|" --only-matching -r "Version \$1 (Build \$2)")
# Count the number of lines in the version list
ios_count=$(echo "$table" | wc -l)
echo -e "Fetching \e[34m$ios_count iOS versions...\e[0m"
# Append to the source file
echo "pub static _IOS_OS_VERSION_LIST: &[&str; $ios_count] = &[" >> "$filename"
num=0
# For each in versions, curl the page and extract the build number
echo "$table" | while IFS= read -r line; do
num=$((num+1))
echo " \"$line\"," >> "$filename"
echo -e "\e[34m[$num/$ios_count] Fetched $line\e[0m."
done
# Close the array in the source file
echo "];" >> "$filename"
echo -e "\e[34mRetrieved $ios_app_count iOS app versions.\e[0m"
echo -e "\e[32mRetrieved $android_count Android app versions.\e[0m"
echo -e "\e[34mRetrieved $ios_count iOS versions.\e[0m"
echo -e "\e[34mTotal: $((ios_app_count + android_count + ios_count))\e[0m"
echo -e "\e[32mSuccess!\e[0m"

seccomp-redlib.json (new file, 125 lines)

@ -0,0 +1,125 @@
{
"defaultAction": "SCMP_ACT_ERRNO",
"archMap": [
{
"architecture": "SCMP_ARCH_X86_64",
"subArchitectures": [
"SCMP_ARCH_X86",
"SCMP_ARCH_X32"
]
},
{
"architecture": "SCMP_ARCH_AARCH64",
"subArchitectures": [
"SCMP_ARCH_ARM"
]
},
{
"architecture": "SCMP_ARCH_MIPS64",
"subArchitectures": [
"SCMP_ARCH_MIPS",
"SCMP_ARCH_MIPS64N32"
]
},
{
"architecture": "SCMP_ARCH_MIPS64N32",
"subArchitectures": [
"SCMP_ARCH_MIPS",
"SCMP_ARCH_MIPS64"
]
},
{
"architecture": "SCMP_ARCH_MIPSEL64",
"subArchitectures": [
"SCMP_ARCH_MIPSEL",
"SCMP_ARCH_MIPSEL64N32"
]
},
{
"architecture": "SCMP_ARCH_MIPSEL64N32",
"subArchitectures": [
"SCMP_ARCH_MIPSEL",
"SCMP_ARCH_MIPSEL64"
]
},
{
"architecture": "SCMP_ARCH_S390X",
"subArchitectures": [
"SCMP_ARCH_S390"
]
}
],
"syscalls": [
{
"names": [
"accept4",
"arch_prctl",
"bind",
"brk",
"clock_gettime",
"clone",
"close",
"connect",
"epoll_create1",
"epoll_ctl",
"epoll_pwait",
"eventfd2",
"execve",
"exit",
"exit_group",
"fcntl",
"flock",
"fork",
"fstat",
"futex",
"getcwd",
"getpeername",
"getpid",
"getrandom",
"getsockname",
"getsockopt",
"getgid",
"getppid",
"gettid",
"getuid",
"ioctl",
"listen",
"lseek",
"madvise",
"mmap",
"mprotect",
"mremap",
"munmap",
"newfstatat",
"open",
"openat",
"prctl",
"poll",
"read",
"recvfrom",
"rt_sigaction",
"rt_sigprocmask",
"rt_sigreturn",
"sched_getaffinity",
"sched_yield",
"sendto",
"setitimer",
"setsockopt",
"set_tid_address",
"shutdown",
"sigaltstack",
"socket",
"socketpair",
"stat",
"wait4",
"write",
"writev"
],
"action": "SCMP_ACT_ALLOW",
"args": [],
"comment": "",
"includes": {},
"excludes": {}
}
]
}

src/client.rs

@ -1,15 +1,34 @@
use cached::proc_macro::cached;
use futures_lite::future::block_on;
use futures_lite::{future::Boxed, FutureExt};
use hyper::{body, body::Buf, client, header, Body, Method, Request, Response, Uri};
use hyper::client::HttpConnector;
use hyper::{body, body::Buf, client, header, Body, Client, Method, Request, Response, Uri};
use hyper_rustls::HttpsConnector;
use libflate::gzip;
use once_cell::sync::Lazy;
use percent_encoding::{percent_encode, CONTROLS};
use serde_json::Value;
use std::{io, result::Result};
use tokio::sync::RwLock;
use crate::dbg_msg;
use crate::oauth::{token_daemon, Oauth};
use crate::server::RequestExt;
use crate::utils::format_url;
const REDDIT_URL_BASE: &str = "https://www.reddit.com";
const REDDIT_URL_BASE: &str = "https://oauth.reddit.com";
pub static CLIENT: Lazy<Client<HttpsConnector<HttpConnector>>> = Lazy::new(|| {
let https = hyper_rustls::HttpsConnectorBuilder::new().with_native_roots().https_only().enable_http1().build();
client::Client::builder().build(https)
});
pub static OAUTH_CLIENT: Lazy<RwLock<Oauth>> = Lazy::new(|| {
let client = block_on(Oauth::new());
tokio::spawn(token_daemon());
RwLock::new(client)
});
/// Gets the canonical path for a resource on Reddit. This is accomplished by
/// making a `HEAD` request to Reddit at the path given in `path`.
@ -26,28 +45,46 @@ const REDDIT_URL_BASE: &str = "https://www.reddit.com";
#[cached(size = 1024, time = 600, result = true)]
pub async fn canonical_path(path: String) -> Result<Option<String>, String> {
let res = reddit_head(path.clone(), true).await?;
let status = res.status().as_u16();
if res.status() == 429 {
return Err("Too many requests.".to_string());
};
match status {
429 => Err("Too many requests.".to_string()),
// If Reddit responds with a 2xx, then the path is already canonical.
if res.status().to_string().starts_with('2') {
return Ok(Some(path));
// If Reddit responds with a 2xx, then the path is already canonical.
200..=299 => Ok(Some(path)),
// If Reddit responds with a 301, then the path is redirected.
301 => match res.headers().get(header::LOCATION) {
Some(val) => {
let original = val.to_str().unwrap();
// We need to strip the .json suffix from the original path.
// In addition, we want to remove share parameters.
// Cut it off here instead of letting it propagate all the way
// to main.rs
let stripped_uri = original.strip_suffix(".json").unwrap_or(original).split('?').next().unwrap_or_default();
// The reason why we now have to format_url, is because the new OAuth
// endpoints seem to return full paths, instead of relative paths.
// So we need to strip the .json suffix from the original path, and
// also remove all Reddit domain parts with format_url.
// Otherwise, it will literally redirect to Reddit.com.
let uri = format_url(stripped_uri);
Ok(Some(uri))
}
None => Ok(None),
},
// If Reddit responds with anything other than 3xx (except for the 2xx and 301
// as above), return a None.
300..=399 => Ok(None),
_ => Ok(
res
.headers()
.get(header::LOCATION)
.map(|val| percent_encode(val.as_bytes(), CONTROLS).to_string().trim_start_matches(REDDIT_URL_BASE).to_string()),
),
}
// If Reddit responds with anything other than 3xx (except for the 2xx as
// above), return a None.
if !res.status().to_string().starts_with('3') {
return Ok(None);
}
Ok(
res
.headers()
.get(header::LOCATION)
.map(|val| percent_encode(val.as_bytes(), CONTROLS).to_string().trim_start_matches(REDDIT_URL_BASE).to_string()),
)
}
pub async fn proxy(req: Request<Body>, format: &str) -> Result<Response<Body>, String> {
@ -66,11 +103,8 @@ async fn stream(url: &str, req: &Request<Body>) -> Result<Response<Body>, String
// First parameter is target URL (mandatory).
let uri = url.parse::<Uri>().map_err(|_| "Couldn't parse URL".to_string())?;
// Prepare the HTTPS connector.
let https = hyper_rustls::HttpsConnectorBuilder::new().with_native_roots().https_only().enable_http1().build();
// Build the hyper client from the HTTPS connector.
let client: client::Client<_, hyper::Body> = client::Client::builder().build(https);
let client: client::Client<_, hyper::Body> = CLIENT.clone();
let mut builder = Request::get(uri);
@ -99,6 +133,8 @@ async fn stream(url: &str, req: &Request<Body>) -> Result<Response<Body>, String
rm("x-cdn-server-region");
rm("x-reddit-cdn");
rm("x-reddit-video-features");
rm("Nel");
rm("Report-To");
res
})
@ -123,24 +159,41 @@ fn request(method: &'static Method, path: String, redirect: bool, quarantine: bo
// Build Reddit URL from path.
let url = format!("{}{}", REDDIT_URL_BASE, path);
// Prepare the HTTPS connector.
let https = hyper_rustls::HttpsConnectorBuilder::new().with_native_roots().https_or_http().enable_http1().build();
// Construct the hyper client from the HTTPS connector.
let client: client::Client<_, hyper::Body> = client::Client::builder().build(https);
let client: client::Client<_, hyper::Body> = CLIENT.clone();
let (token, vendor_id, device_id, user_agent, loid) = {
let client = block_on(OAUTH_CLIENT.read());
(
client.token.clone(),
client.headers_map.get("Client-Vendor-Id").cloned().unwrap_or_default(),
client.headers_map.get("X-Reddit-Device-Id").cloned().unwrap_or_default(),
client.headers_map.get("User-Agent").cloned().unwrap_or_default(),
client.headers_map.get("x-reddit-loid").cloned().unwrap_or_default(),
)
};
// Build request to Reddit. When making a GET, request gzip compression.
// (Reddit doesn't do brotli yet.)
let builder = Request::builder()
.method(method)
.uri(&url)
.header("User-Agent", format!("web:libreddit:{}", env!("CARGO_PKG_VERSION")))
.header("Host", "www.reddit.com")
.header("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8")
.header("User-Agent", user_agent)
.header("Client-Vendor-Id", vendor_id)
.header("X-Reddit-Device-Id", device_id)
.header("x-reddit-loid", loid)
.header("Host", "oauth.reddit.com")
.header("Authorization", &format!("Bearer {}", token))
.header("Accept-Encoding", if method == Method::GET { "gzip" } else { "identity" })
.header("Accept-Language", "en-US,en;q=0.5")
.header("Connection", "keep-alive")
.header("Cookie", if quarantine { "_options=%7B%22pref_quarantine_optin%22%3A%20true%7D" } else { "" })
.header(
"Cookie",
if quarantine {
"_options=%7B%22pref_quarantine_optin%22%3A%20true%2C%20%22pref_gated_sr_optin%22%3A%20true%7D"
} else {
""
},
)
.body(Body::empty());
async move {
@ -149,7 +202,7 @@ fn request(method: &'static Method, path: String, redirect: bool, quarantine: bo
Ok(mut response) => {
// Reddit may respond with a 3xx. Decide whether or not to
// redirect based on caller params.
if response.status().to_string().starts_with('3') {
if response.status().is_redirection() {
if !redirect {
return Ok(response);
};
@ -294,3 +347,24 @@ pub async fn json(path: String, quarantine: bool) -> Result<Value, String> {
Err(e) => err("Couldn't send request to Reddit", e),
}
}
#[tokio::test(flavor = "multi_thread")]
async fn test_localization_popular() {
let val = json("/r/popular/hot.json?&raw_json=1&geo_filter=GLOBAL".to_string(), false).await.unwrap();
assert_eq!("GLOBAL", val["data"]["geo_filter"].as_str().unwrap());
}
#[tokio::test(flavor = "multi_thread")]
async fn test_obfuscated_share_link() {
let share_link = "/r/rust/s/kPgq8WNHRK".into();
// Correct link without share parameters
let canonical_link = "/r/rust/comments/18t5968/why_use_tuple_struct_over_standard_struct/kfbqlbc".into();
assert_eq!(canonical_path(share_link).await, Ok(Some(canonical_link)));
}
#[tokio::test(flavor = "multi_thread")]
async fn test_share_link_strip_json() {
let link = "/17krzvz".into();
let canonical_link = "/r/nfl/comments/17krzvz/rapoport_sources_former_no_2_overall_pick/".into();
assert_eq!(canonical_path(link).await, Ok(Some(canonical_link)));
}
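The rewritten `canonical_path` above trims the `Location` header before redirecting: a trailing `.json` suffix is removed and share parameters after `?` are cut off, and the result is then passed through `format_url` because the OAuth endpoints return absolute URLs. A standalone sketch of just the string cleanup, using placeholder post paths (the `format_url` step is omitted):

```rust
/// Sketch of the Location-header cleanup performed in `canonical_path`:
/// strip a trailing `.json` and drop anything after `?` (share parameters).
fn clean_location(original: &str) -> String {
	original
		.strip_suffix(".json")
		.unwrap_or(original)
		.split('?')
		.next()
		.unwrap_or_default()
		.to_string()
}

fn main() {
	// Placeholder paths, purely illustrative.
	assert_eq!(clean_location("/r/rust/comments/abc123/example_post/.json"), "/r/rust/comments/abc123/example_post/");
	assert_eq!(clean_location("/r/rust/comments/abc123/example_post/?share_id=xyz"), "/r/rust/comments/abc123/example_post/");
	println!("ok");
}
```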

src/config.rs

@ -7,53 +7,90 @@ use std::{env::var, fs::read_to_string};
//
// This is the local static that is initialized at runtime (technically at
// first request) and contains the instance settings.
pub(crate) static CONFIG: Lazy<Config> = Lazy::new(Config::load);
pub static CONFIG: Lazy<Config> = Lazy::new(Config::load);
// This serves as the frontend for the Pushshift API - on removed comments, this URL will
// be the base of a link, to display removed content (on another site).
pub const DEFAULT_PUSHSHIFT_FRONTEND: &str = "www.unddit.com";
/// Stores the configuration parsed from the environment variables and the
/// config file. `Config::Default()` contains None for each setting.
/// When adding more config settings, add it to `Config::load`,
/// `get_setting_from_config`, both below, as well as
/// instance_info::InstanceInfo.to_string(), README.md and app.json.
#[derive(Default, Serialize, Deserialize, Clone)]
#[derive(Default, Serialize, Deserialize, Clone, Debug)]
pub struct Config {
#[serde(rename = "LIBREDDIT_SFW_ONLY")]
#[serde(rename = "REDLIB_SFW_ONLY")]
#[serde(alias = "LIBREDDIT_SFW_ONLY")]
pub(crate) sfw_only: Option<String>,
#[serde(rename = "LIBREDDIT_DEFAULT_THEME")]
#[serde(rename = "REDLIB_DEFAULT_THEME")]
#[serde(alias = "LIBREDDIT_DEFAULT_THEME")]
pub(crate) default_theme: Option<String>,
#[serde(rename = "LIBREDDIT_DEFAULT_FRONT_PAGE")]
#[serde(rename = "REDLIB_DEFAULT_FRONT_PAGE")]
#[serde(alias = "LIBREDDIT_DEFAULT_FRONT_PAGE")]
pub(crate) default_front_page: Option<String>,
#[serde(rename = "LIBREDDIT_DEFAULT_LAYOUT")]
#[serde(rename = "REDLIB_DEFAULT_LAYOUT")]
#[serde(alias = "LIBREDDIT_DEFAULT_LAYOUT")]
pub(crate) default_layout: Option<String>,
#[serde(rename = "LIBREDDIT_DEFAULT_WIDE")]
#[serde(rename = "REDLIB_DEFAULT_WIDE")]
#[serde(alias = "LIBREDDIT_DEFAULT_WIDE")]
pub(crate) default_wide: Option<String>,
#[serde(rename = "LIBREDDIT_DEFAULT_COMMENT_SORT")]
#[serde(rename = "REDLIB_DEFAULT_COMMENT_SORT")]
#[serde(alias = "LIBREDDIT_DEFAULT_COMMENT_SORT")]
pub(crate) default_comment_sort: Option<String>,
#[serde(rename = "LIBREDDIT_DEFAULT_POST_SORT")]
#[serde(rename = "REDLIB_DEFAULT_POST_SORT")]
#[serde(alias = "LIBREDDIT_DEFAULT_POST_SORT")]
pub(crate) default_post_sort: Option<String>,
#[serde(rename = "LIBREDDIT_DEFAULT_SHOW_NSFW")]
#[serde(rename = "REDLIB_DEFAULT_SHOW_NSFW")]
#[serde(alias = "LIBREDDIT_DEFAULT_SHOW_NSFW")]
pub(crate) default_show_nsfw: Option<String>,
#[serde(rename = "LIBREDDIT_DEFAULT_BLUR_NSFW")]
#[serde(rename = "REDLIB_DEFAULT_BLUR_NSFW")]
#[serde(alias = "LIBREDDIT_DEFAULT_BLUR_NSFW")]
pub(crate) default_blur_nsfw: Option<String>,
#[serde(rename = "LIBREDDIT_DEFAULT_USE_HLS")]
#[serde(rename = "REDLIB_DEFAULT_USE_HLS")]
#[serde(alias = "LIBREDDIT_DEFAULT_USE_HLS")]
pub(crate) default_use_hls: Option<String>,
#[serde(rename = "LIBREDDIT_DEFAULT_HIDE_HLS_NOTIFICATION")]
#[serde(rename = "REDLIB_DEFAULT_HIDE_HLS_NOTIFICATION")]
#[serde(alias = "LIBREDDIT_DEFAULT_HIDE_HLS_NOTIFICATION")]
pub(crate) default_hide_hls_notification: Option<String>,
#[serde(rename = "LIBREDDIT_DEFAULT_HIDE_AWARDS")]
#[serde(rename = "REDLIB_DEFAULT_HIDE_AWARDS")]
#[serde(alias = "LIBREDDIT_DEFAULT_HIDE_AWARDS")]
pub(crate) default_hide_awards: Option<String>,
#[serde(rename = "LIBREDDIT_BANNER")]
#[serde(rename = "REDLIB_DEFAULT_HIDE_SCORE")]
#[serde(alias = "LIBREDDIT_DEFAULT_HIDE_SCORE")]
pub(crate) default_hide_score: Option<String>,
#[serde(rename = "REDLIB_DEFAULT_SUBSCRIPTIONS")]
#[serde(alias = "LIBREDDIT_DEFAULT_SUBSCRIPTIONS")]
pub(crate) default_subscriptions: Option<String>,
#[serde(rename = "REDLIB_DEFAULT_DISABLE_VISIT_REDDIT_CONFIRMATION")]
#[serde(alias = "LIBREDDIT_DEFAULT_DISABLE_VISIT_REDDIT_CONFIRMATION")]
pub(crate) default_disable_visit_reddit_confirmation: Option<String>,
#[serde(rename = "REDLIB_BANNER")]
#[serde(alias = "LIBREDDIT_BANNER")]
pub(crate) banner: Option<String>,
#[serde(rename = "REDLIB_ROBOTS_DISABLE_INDEXING")]
#[serde(alias = "LIBREDDIT_ROBOTS_DISABLE_INDEXING")]
pub(crate) robots_disable_indexing: Option<String>,
#[serde(rename = "REDLIB_PUSHSHIFT_FRONTEND")]
#[serde(alias = "LIBREDDIT_PUSHSHIFT_FRONTEND")]
pub(crate) pushshift: Option<String>,
}
impl Config {
@ -61,52 +98,71 @@ impl Config {
/// In the case that there are no environment variables set and there is no
/// config file, this function returns a Config that contains all None values.
pub fn load() -> Self {
// Read from libreddit.toml config file. If for any reason, it fails, the
// default `Config` is used (all None values)
let config: Config = toml::from_str(&read_to_string("libreddit.toml").unwrap_or_default()).unwrap_or_default();
let load_config = |name: &str| {
let new_file = read_to_string(name);
new_file.ok().and_then(|new_file| toml::from_str::<Self>(&new_file).ok())
};
let config = load_config("redlib.toml").or(load_config("libreddit.toml")).unwrap_or_default();
// This function defines the order of preference - first check for
// environment variables with "LIBREDDIT", then check the config, then if
// both are `None`, return a `None` via the `map_or_else` function
let parse = |key: &str| -> Option<String> { var(key).ok().map_or_else(|| get_setting_from_config(key, &config), Some) };
// environment variables with "REDLIB", then check the legacy LIBREDDIT
// option, then check the config, then if all are `None`, return a `None`
let parse = |key: &str| -> Option<String> {
// Return the first non-`None` value
// If all are `None`, return `None`
let legacy_key = key.replace("REDLIB_", "LIBREDDIT_");
var(key).ok().or(var(legacy_key).ok()).or(get_setting_from_config(key, &config))
};
Self {
sfw_only: parse("LIBREDDIT_SFW_ONLY"),
default_theme: parse("LIBREDDIT_DEFAULT_THEME"),
default_front_page: parse("LIBREDDIT_DEFAULT_FRONT_PAGE"),
default_layout: parse("LIBREDDIT_DEFAULT_LAYOUT"),
default_post_sort: parse("LIBREDDIT_DEFAULT_POST_SORT"),
default_wide: parse("LIBREDDIT_DEFAULT_WIDE"),
default_comment_sort: parse("LIBREDDIT_DEFAULT_COMMENT_SORT"),
default_show_nsfw: parse("LIBREDDIT_DEFAULT_SHOW_NSFW"),
default_blur_nsfw: parse("LIBREDDIT_DEFAULT_BLUR_NSFW"),
default_use_hls: parse("LIBREDDIT_DEFAULT_USE_HLS"),
default_hide_hls_notification: parse("LIBREDDIT_DEFAULT_HIDE_HLS"),
default_hide_awards: parse("LIBREDDIT_DEFAULT_HIDE_AWARDS"),
banner: parse("LIBREDDIT_BANNER"),
sfw_only: parse("REDLIB_SFW_ONLY"),
default_theme: parse("REDLIB_DEFAULT_THEME"),
default_front_page: parse("REDLIB_DEFAULT_FRONT_PAGE"),
default_layout: parse("REDLIB_DEFAULT_LAYOUT"),
default_post_sort: parse("REDLIB_DEFAULT_POST_SORT"),
default_wide: parse("REDLIB_DEFAULT_WIDE"),
default_comment_sort: parse("REDLIB_DEFAULT_COMMENT_SORT"),
default_show_nsfw: parse("REDLIB_DEFAULT_SHOW_NSFW"),
default_blur_nsfw: parse("REDLIB_DEFAULT_BLUR_NSFW"),
default_use_hls: parse("REDLIB_DEFAULT_USE_HLS"),
default_hide_hls_notification: parse("REDLIB_DEFAULT_HIDE_HLS"),
default_hide_awards: parse("REDLIB_DEFAULT_HIDE_AWARDS"),
default_hide_score: parse("REDLIB_DEFAULT_HIDE_SCORE"),
default_subscriptions: parse("REDLIB_DEFAULT_SUBSCRIPTIONS"),
default_disable_visit_reddit_confirmation: parse("REDLIB_DEFAULT_DISABLE_VISIT_REDDIT_CONFIRMATION"),
banner: parse("REDLIB_BANNER"),
robots_disable_indexing: parse("REDLIB_ROBOTS_DISABLE_INDEXING"),
pushshift: parse("REDLIB_PUSHSHIFT_FRONTEND"),
}
}
}
fn get_setting_from_config(name: &str, config: &Config) -> Option<String> {
match name {
"LIBREDDIT_SFW_ONLY" => config.sfw_only.clone(),
"LIBREDDIT_DEFAULT_THEME" => config.default_theme.clone(),
"LIBREDDIT_DEFAULT_FRONT_PAGE" => config.default_front_page.clone(),
"LIBREDDIT_DEFAULT_LAYOUT" => config.default_layout.clone(),
"LIBREDDIT_DEFAULT_COMMENT_SORT" => config.default_comment_sort.clone(),
"LIBREDDIT_DEFAULT_POST_SORT" => config.default_post_sort.clone(),
"LIBREDDIT_DEFAULT_SHOW_NSFW" => config.default_show_nsfw.clone(),
"LIBREDDIT_DEFAULT_BLUR_NSFW" => config.default_blur_nsfw.clone(),
"LIBREDDIT_DEFAULT_USE_HLS" => config.default_use_hls.clone(),
"LIBREDDIT_DEFAULT_HIDE_HLS_NOTIFICATION" => config.default_hide_hls_notification.clone(),
"LIBREDDIT_DEFAULT_WIDE" => config.default_wide.clone(),
"LIBREDDIT_DEFAULT_HIDE_AWARDS" => config.default_hide_awards.clone(),
"LIBREDDIT_BANNER" => config.banner.clone(),
"REDLIB_SFW_ONLY" => config.sfw_only.clone(),
"REDLIB_DEFAULT_THEME" => config.default_theme.clone(),
"REDLIB_DEFAULT_FRONT_PAGE" => config.default_front_page.clone(),
"REDLIB_DEFAULT_LAYOUT" => config.default_layout.clone(),
"REDLIB_DEFAULT_COMMENT_SORT" => config.default_comment_sort.clone(),
"REDLIB_DEFAULT_POST_SORT" => config.default_post_sort.clone(),
"REDLIB_DEFAULT_SHOW_NSFW" => config.default_show_nsfw.clone(),
"REDLIB_DEFAULT_BLUR_NSFW" => config.default_blur_nsfw.clone(),
"REDLIB_DEFAULT_USE_HLS" => config.default_use_hls.clone(),
"REDLIB_DEFAULT_HIDE_HLS_NOTIFICATION" => config.default_hide_hls_notification.clone(),
"REDLIB_DEFAULT_WIDE" => config.default_wide.clone(),
"REDLIB_DEFAULT_HIDE_AWARDS" => config.default_hide_awards.clone(),
"REDLIB_DEFAULT_HIDE_SCORE" => config.default_hide_score.clone(),
"REDLIB_DEFAULT_SUBSCRIPTIONS" => config.default_subscriptions.clone(),
"REDLIB_DEFAULT_DISABLE_VISIT_REDDIT_CONFIRMATION" => config.default_disable_visit_reddit_confirmation.clone(),
"REDLIB_BANNER" => config.banner.clone(),
"REDLIB_ROBOTS_DISABLE_INDEXING" => config.robots_disable_indexing.clone(),
"REDLIB_PUSHSHIFT_FRONTEND" => config.pushshift.clone(),
_ => None,
}
}
/// Retrieves setting from environment variable or config file.
pub(crate) fn get_setting(name: &str) -> Option<String> {
pub fn get_setting(name: &str) -> Option<String> {
get_setting_from_config(name, &CONFIG)
}
@ -114,7 +170,14 @@ pub(crate) fn get_setting(name: &str) -> Option<String> {
use {sealed_test::prelude::*, std::fs::write};
#[test]
#[sealed_test(env = [("LIBREDDIT_SFW_ONLY", "on")])]
fn test_deserialize() {
// Must handle empty input
let result = toml::from_str::<Config>("");
assert!(result.is_ok(), "Error: {}", result.unwrap_err());
}
#[test]
#[sealed_test(env = [("REDLIB_SFW_ONLY", "on")])]
fn test_env_var() {
assert!(crate::utils::sfw_only())
}
@ -122,23 +185,51 @@ fn test_env_var() {
#[test]
#[sealed_test]
fn test_config() {
let config_to_write = r#"LIBREDDIT_DEFAULT_COMMENT_SORT = "best""#;
write("libreddit.toml", config_to_write).unwrap();
assert_eq!(get_setting("LIBREDDIT_DEFAULT_COMMENT_SORT"), Some("best".into()));
let config_to_write = r#"REDLIB_DEFAULT_COMMENT_SORT = "best""#;
write("redlib.toml", config_to_write).unwrap();
assert_eq!(get_setting("REDLIB_DEFAULT_COMMENT_SORT"), Some("best".into()));
}
#[test]
#[sealed_test(env = [("LIBREDDIT_DEFAULT_COMMENT_SORT", "top")])]
#[sealed_test]
fn test_config_legacy() {
let config_to_write = r#"LIBREDDIT_DEFAULT_COMMENT_SORT = "best""#;
write("libreddit.toml", config_to_write).unwrap();
assert_eq!(get_setting("REDLIB_DEFAULT_COMMENT_SORT"), Some("best".into()));
}
#[test]
#[sealed_test(env = [("LIBREDDIT_SFW_ONLY", "on")])]
fn test_env_var_legacy() {
assert!(crate::utils::sfw_only())
}
#[test]
#[sealed_test(env = [("REDLIB_DEFAULT_COMMENT_SORT", "top")])]
fn test_env_config_precedence() {
let config_to_write = r#"LIBREDDIT_DEFAULT_COMMENT_SORT = "best""#;
write("libreddit.toml", config_to_write).unwrap();
assert_eq!(get_setting("LIBREDDIT_DEFAULT_COMMENT_SORT"), Some("top".into()))
let config_to_write = r#"REDLIB_DEFAULT_COMMENT_SORT = "best""#;
write("redlib.toml", config_to_write).unwrap();
assert_eq!(get_setting("REDLIB_DEFAULT_COMMENT_SORT"), Some("top".into()))
}
#[test]
#[sealed_test(env = [("LIBREDDIT_DEFAULT_COMMENT_SORT", "top")])]
#[sealed_test(env = [("REDLIB_DEFAULT_COMMENT_SORT", "top")])]
fn test_alt_env_config_precedence() {
let config_to_write = r#"LIBREDDIT_DEFAULT_COMMENT_SORT = "best""#;
write("libreddit.toml", config_to_write).unwrap();
assert_eq!(get_setting("LIBREDDIT_DEFAULT_COMMENT_SORT"), Some("top".into()))
let config_to_write = r#"REDLIB_DEFAULT_COMMENT_SORT = "best""#;
write("redlib.toml", config_to_write).unwrap();
assert_eq!(get_setting("REDLIB_DEFAULT_COMMENT_SORT"), Some("top".into()))
}
#[test]
#[sealed_test(env = [("REDLIB_DEFAULT_SUBSCRIPTIONS", "news+bestof")])]
fn test_default_subscriptions() {
assert_eq!(get_setting("REDLIB_DEFAULT_SUBSCRIPTIONS"), Some("news+bestof".into()));
}
#[test]
#[sealed_test]
fn test_pushshift() {
let config_to_write = r#"REDLIB_PUSHSHIFT_FRONTEND = "https://api.pushshift.io""#;
write("redlib.toml", config_to_write).unwrap();
assert!(get_setting("REDLIB_PUSHSHIFT_FRONTEND").is_some());
assert_eq!(get_setting("REDLIB_PUSHSHIFT_FRONTEND"), Some("https://api.pushshift.io".into()));
}
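The new lookup order is: a `REDLIB_*` environment variable first, then its legacy `LIBREDDIT_*` counterpart, then the value from `redlib.toml`/`libreddit.toml`, and finally `None`. A self-contained sketch of that precedence, with the config-file lookup stubbed out as a closure:

```rust
use std::env::var;

// Sketch of the REDLIB_* / LIBREDDIT_* / config-file precedence used in `Config::load`.
// `from_config` stands in for `get_setting_from_config`; the closure in main is a stub.
fn resolve(key: &str, from_config: impl Fn(&str) -> Option<String>) -> Option<String> {
	let legacy_key = key.replace("REDLIB_", "LIBREDDIT_");
	var(key).ok().or(var(legacy_key).ok()).or(from_config(key))
}

fn main() {
	// Illustrative stub: pretend the config file only sets a default comment sort.
	let from_config = |key: &str| (key == "REDLIB_DEFAULT_COMMENT_SORT").then(|| "best".to_string());
	// With neither the REDLIB_ nor the LIBREDDIT_ env var set, the config-file value wins.
	println!("{:?}", resolve("REDLIB_DEFAULT_COMMENT_SORT", from_config));
}
```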

src/duplicates.rs

@ -3,7 +3,7 @@
use crate::client::json;
use crate::server::RequestExt;
use crate::subreddit::{can_access_quarantine, quarantine};
use crate::utils::{error, filter_posts, get_filters, nsfw_landing, parse_post, setting, template, Post, Preferences};
use crate::utils::{error, filter_posts, get_filters, nsfw_landing, parse_post, template, Post, Preferences};
use askama::Template;
use hyper::{Body, Request, Response};
@ -67,11 +67,12 @@ pub async fn item(req: Request<Body>) -> Result<Response<Body>, String> {
Ok(response) => {
let post = parse_post(&response[0]["data"]["children"][0]).await;
let req_url = req.uri().to_string();
// Return landing page if Reddit deems this post
// NSFW, but we have also disabled the display of NSFW content
// or if the instance is SFW-only.
if post.nsfw && (setting(&req, "show_nsfw") != "on" || crate::utils::sfw_only()) {
return Ok(nsfw_landing(req).await.unwrap_or_default());
// or if the instance is SFW-only
if post.nsfw && crate::utils::should_be_nsfw_gated(&req, &req_url) {
return Ok(nsfw_landing(req, req_url).await.unwrap_or_default());
}
let filters = get_filters(&req);
@ -195,14 +196,13 @@ pub async fn item(req: Request<Body>) -> Result<Response<Body>, String> {
after = response[1]["data"]["after"].as_str().unwrap_or_default().to_string();
}
}
let url = req.uri().to_string();
template(DuplicatesTemplate {
params: DuplicatesParams { before, after, sort },
post,
duplicates,
prefs: Preferences::new(&req),
url,
url: req_url,
num_posts_filtered,
all_posts_filtered,
})
@ -210,9 +210,9 @@ pub async fn item(req: Request<Body>) -> Result<Response<Body>, String> {
// Process error.
Err(msg) => {
if msg == "quarantined" {
if msg == "quarantined" || msg == "gated" {
let sub = req.param("sub").unwrap_or_default();
quarantine(req, sub)
quarantine(req, sub, msg)
} else {
error(req, msg).await
}

src/instance_info.rs

@ -13,13 +13,13 @@ use time::OffsetDateTime;
// This is the local static that is initialized at runtime (technically at
// the first request to the info endpoint) and contains the data
// retrieved from the info endpoint.
pub(crate) static INSTANCE_INFO: Lazy<InstanceInfo> = Lazy::new(InstanceInfo::new);
pub static INSTANCE_INFO: Lazy<InstanceInfo> = Lazy::new(InstanceInfo::new);
/// Handles instance info endpoint
pub async fn instance_info(req: Request<Body>) -> Result<Response<Body>, String> {
// This will retrieve the extension given, or create a new string - which will
// simply become the last option, an HTML page.
let extension = req.param("extension").unwrap_or(String::new());
let extension = req.param("extension").unwrap_or_default();
let response = match extension.as_str() {
"yaml" | "yml" => info_yaml(),
"txt" => info_txt(),
@ -82,7 +82,8 @@ fn info_html(req: Request<Body>) -> Result<Response<Body>, Error> {
Response::builder().status(200).header("content-type", "text/html; charset=utf8").body(Body::from(message))
}
#[derive(Serialize, Deserialize, Default)]
pub(crate) struct InstanceInfo {
pub struct InstanceInfo {
package_name: String,
crate_version: String,
git_commit: String,
deploy_date: String,
@ -94,6 +95,7 @@ pub(crate) struct InstanceInfo {
impl InstanceInfo {
pub fn new() -> Self {
Self {
package_name: env!("CARGO_PKG_NAME").to_string(),
crate_version: env!("CARGO_PKG_VERSION").to_string(),
git_commit: env!("GIT_HASH").to_string(),
deploy_date: OffsetDateTime::now_local().unwrap_or_else(|_| OffsetDateTime::now_utc()).to_string(),
@ -116,12 +118,15 @@ impl InstanceInfo {
}
container.add_table(
Table::from([
["Package name", &self.package_name],
["Crate version", &self.crate_version],
["Git commit", &self.git_commit],
["Deploy date", &self.deploy_date],
["Deploy timestamp", &self.deploy_unix_ts.to_string()],
["Compile mode", &self.compile_mode],
["SFW only", &convert(&self.config.sfw_only)],
["Pushshift frontend", &convert(&self.config.pushshift)],
//TODO: fallback to crate::config::DEFAULT_PUSHSHIFT_FRONTEND
])
.with_header_row(["Settings"]),
);
@ -129,6 +134,7 @@ impl InstanceInfo {
container.add_table(
Table::from([
["Hide awards", &convert(&self.config.default_hide_awards)],
["Hide score", &convert(&self.config.default_hide_score)],
["Theme", &convert(&self.config.default_theme)],
["Front page", &convert(&self.config.default_front_page)],
["Layout", &convert(&self.config.default_layout)],
@ -139,6 +145,7 @@ impl InstanceInfo {
["Blur NSFW", &convert(&self.config.default_blur_nsfw)],
["Use HLS", &convert(&self.config.default_use_hls)],
["Hide HLS notification", &convert(&self.config.default_hide_hls_notification)],
["Subscriptions", &convert(&self.config.default_subscriptions)],
])
.with_header_row(["Default preferences"]),
);
@ -148,15 +155,18 @@ impl InstanceInfo {
match string_type {
StringType::Raw => {
format!(
"Crate version: {}\n
"Package name: {}\n
Crate version: {}\n
Git commit: {}\n
Deploy date: {}\n
Deploy timestamp: {}\n
Compile mode: {}\n
SFW only: {:?}\n
Pushshift frontend: {:?}\n
Config:\n
Banner: {:?}\n
Hide awards: {:?}\n
SFW only: {:?}\n
Hide score: {:?}\n
Default theme: {:?}\n
Default front page: {:?}\n
Default layout: {:?}\n
@ -166,15 +176,19 @@ impl InstanceInfo {
Default show NSFW: {:?}\n
Default blur NSFW: {:?}\n
Default use HLS: {:?}\n
Default hide HLS notification: {:?}\n",
Default hide HLS notification: {:?}\n
Default subscriptions: {:?}\n",
self.package_name,
self.crate_version,
self.git_commit,
self.deploy_date,
self.deploy_unix_ts,
self.compile_mode,
self.config.sfw_only,
self.config.pushshift,
self.config.banner,
self.config.default_hide_awards,
self.config.sfw_only,
self.config.default_hide_score,
self.config.default_theme,
self.config.default_front_page,
self.config.default_layout,
@ -184,7 +198,8 @@ impl InstanceInfo {
self.config.default_show_nsfw,
self.config.default_blur_nsfw,
self.config.default_use_hls,
self.config.default_hide_hls_notification
self.config.default_hide_hls_notification,
self.config.default_subscriptions,
)
}
StringType::Html => self.to_table(),
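Once an instance is running, the instance-info endpoint shown above can be queried in the plain-text or YAML formats it supports (a local instance on the default port 8080 is assumed):

```
curl http://localhost:8080/info.txt
curl http://localhost:8080/info.yaml
```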

src/main.rs

@ -6,6 +6,8 @@
mod config;
mod duplicates;
mod instance_info;
mod oauth;
mod oauth_resources;
mod post;
mod search;
mod settings;
@ -21,10 +23,13 @@ use hyper::{header::HeaderValue, Body, Request, Response};
mod client;
use client::{canonical_path, proxy};
use log::info;
use once_cell::sync::Lazy;
use server::RequestExt;
use utils::{error, redirect, ThemeAssets};
use crate::client::OAUTH_CLIENT;
mod server;
// Create Services
@ -108,7 +113,13 @@ async fn style() -> Result<Response<Body>, String> {
#[tokio::main]
async fn main() {
let matches = Command::new("Libreddit")
// Load environment variables
_ = dotenvy::dotenv();
// Initialize logger
pretty_env_logger::init();
let matches = Command::new("Redlib")
.version(env!("CARGO_PKG_VERSION"))
.about("Private front-end for Reddit written in Rust ")
.arg(
@ -155,17 +166,23 @@ async fn main() {
let listener = [address, ":", port].concat();
println!("Starting Libreddit...");
println!("Starting Redlib...");
// Begin constructing a server
let mut app = server::Server::new();
// Force evaluation of statics. In instance_info case, we need to evaluate
// the timestamp so deploy date is accurate - in config case, we need to
// evaluate the configuration to avoid paying penalty at first request.
// the timestamp so deploy date is accurate - in config case, we need to
// evaluate the configuration to avoid paying penalty at first request -
// in OAUTH case, we need to retrieve the token to avoid paying penalty
// at first request
info!("Evaluating config.");
Lazy::force(&config::CONFIG);
info!("Evaluating instance info.");
Lazy::force(&instance_info::INSTANCE_INFO);
info!("Creating OAUTH client.");
Lazy::force(&OAUTH_CLIENT);
// Define default headers (added to all responses)
app.default_headers = headers! {
@ -186,9 +203,21 @@ async fn main() {
app
.at("/manifest.json")
.get(|_| resource(include_str!("../static/manifest.json"), "application/json", false).boxed());
app
.at("/robots.txt")
.get(|_| resource("User-agent: *\nDisallow: /u/\nDisallow: /user/", "text/plain", true).boxed());
app.at("/robots.txt").get(|_| {
resource(
if match config::get_setting("REDLIB_ROBOTS_DISABLE_INDEXING") {
Some(val) => val == "on",
None => false,
} {
"User-agent: *\nDisallow: /"
} else {
"User-agent: *\nDisallow: /u/\nDisallow: /user/"
},
"text/plain",
true,
)
.boxed()
});
app.at("/favicon.ico").get(|_| favicon().boxed());
app.at("/logo.png").get(|_| pwa_logo().boxed());
app.at("/Inter.var.woff2").get(|_| font().boxed());
@ -200,8 +229,11 @@ async fn main() {
app
.at("/hls.min.js")
.get(|_| resource(include_str!("../static/hls.min.js"), "text/javascript", false).boxed());
app
.at("/highlighted.js")
.get(|_| resource(include_str!("../static/highlighted.js"), "text/javascript", false).boxed());
// Proxy media through Libreddit
// Proxy media through Redlib
app.at("/vid/:id/:size").get(|r| proxy(r, "https://v.redd.it/{id}/DASH_{size}").boxed());
app.at("/hls/:id/*path").get(|r| proxy(r, "https://v.redd.it/{id}/{path}").boxed());
app.at("/img/*path").get(|r| proxy(r, "https://i.redd.it/{path}").boxed());
@ -298,6 +330,25 @@ async fn main() {
app.at("/info").get(|r| instance_info::instance_info(r).boxed());
app.at("/info.:extension").get(|r| instance_info::instance_info(r).boxed());
// Handle obfuscated share links.
// Note that this still forces the server to follow the share link to get to the post, so this may want to be updated to show a warning before it follows the link
app.at("/r/:sub/s/:id").get(|req: Request<Body>| {
Box::pin(async move {
let sub = req.param("sub").unwrap_or_default();
match req.param("id").as_deref() {
// Share link
Some(id) if (8..12).contains(&id.len()) => match canonical_path(format!("/r/{}/s/{}", sub, id)).await {
Ok(Some(path)) => Ok(redirect(path)),
Ok(None) => error(req, "Post ID is invalid. It may point to a post on a community that has been banned.").await,
Err(e) => error(req, e).await,
},
// Error message for unknown pages
_ => error(req, "Nothing here".to_string()).await,
}
})
});
app.at("/:id").get(|req: Request<Body>| {
Box::pin(async move {
match req.param("id").as_deref() {
@ -322,7 +373,7 @@ async fn main() {
// Default service in case no routes match
app.at("/*").get(|req| error(req, "Nothing here".to_string()).boxed());
println!("Running Libreddit v{} on {}!", env!("CARGO_PKG_VERSION"), listener);
println!("Running Redlib v{} on {}!", env!("CARGO_PKG_VERSION"), listener);
let server = app.listen(listener);
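The obfuscated share-link route added above can be exercised with any HTTP client; the server resolves the share link and answers with a redirect to the canonical post path. The sketch below reuses the share ID from the `client.rs` test earlier in this diff and assumes a local instance on the default port:

```
curl -I http://localhost:8080/r/rust/s/kPgq8WNHRK
# Expect a redirect whose Location is the canonical /r/rust/comments/... path
```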

src/oauth.rs (new file, 197 lines)

@ -0,0 +1,197 @@
use std::{collections::HashMap, time::Duration};
use crate::{
client::{CLIENT, OAUTH_CLIENT},
oauth_resources::ANDROID_APP_VERSION_LIST,
};
use base64::{engine::general_purpose, Engine as _};
use hyper::{client, Body, Method, Request};
use log::info;
use serde_json::json;
static REDDIT_ANDROID_OAUTH_CLIENT_ID: &str = "ohXpoqrZYub1kg";
static AUTH_ENDPOINT: &str = "https://accounts.reddit.com";
// Spoofed client for Android devices
#[derive(Debug, Clone, Default)]
pub struct Oauth {
pub(crate) initial_headers: HashMap<String, String>,
pub(crate) headers_map: HashMap<String, String>,
pub(crate) token: String,
expires_in: u64,
device: Device,
}
impl Oauth {
pub(crate) async fn new() -> Self {
let mut oauth = Self::default();
oauth.login().await;
oauth
}
pub(crate) fn default() -> Self {
// Generate a device to spoof
let device = Device::new();
let headers_map = device.headers.clone();
let initial_headers = device.initial_headers.clone();
// For now, just insert headers - no token request
Self {
headers_map,
initial_headers,
token: String::new(),
expires_in: 0,
device,
}
}
async fn login(&mut self) -> Option<()> {
// Construct URL for OAuth token
let url = format!("{}/api/access_token", AUTH_ENDPOINT);
let mut builder = Request::builder().method(Method::POST).uri(&url);
// Add headers from spoofed client
for (key, value) in self.initial_headers.iter() {
builder = builder.header(key, value);
}
// Set up HTTP Basic Auth - basically just the const OAuth IDs with no password,
// Base64-encoded. https://en.wikipedia.org/wiki/Basic_access_authentication
// This could be constant, but I don't think it's worth it. OAuth IDs can change
// over time and we want to be flexible.
let auth = general_purpose::STANDARD.encode(format!("{}:", self.device.oauth_id));
builder = builder.header("Authorization", format!("Basic {auth}"));
// Set JSON body. I couldn't tell you what this means. But that's what the client sends
let json = json!({
"scopes": ["*","email"]
});
let body = Body::from(json.to_string());
// Build request
let request = builder.body(body).unwrap();
// Send request
let client: client::Client<_, hyper::Body> = CLIENT.clone();
let resp = client.request(request).await.ok()?;
// Parse headers - the loid header _should_ be saved and sent on subsequent token refreshes.
// Technically it's not needed, but it's easy for Reddit API to check for this.
// It's some kind of header that uniquely identifies the device.
if let Some(header) = resp.headers().get("x-reddit-loid") {
self.headers_map.insert("x-reddit-loid".to_owned(), header.to_str().ok()?.to_string());
}
// Same with x-reddit-session
if let Some(header) = resp.headers().get("x-reddit-session") {
self.headers_map.insert("x-reddit-session".to_owned(), header.to_str().ok()?.to_string());
}
// Serialize response
let body_bytes = hyper::body::to_bytes(resp.into_body()).await.ok()?;
let json: serde_json::Value = serde_json::from_slice(&body_bytes).ok()?;
// Save token and expiry
self.token = json.get("access_token")?.as_str()?.to_string();
self.expires_in = json.get("expires_in")?.as_u64()?;
self.headers_map.insert("Authorization".to_owned(), format!("Bearer {}", self.token));
info!("[✅] Success - Retrieved token \"{}...\", expires in {}", &self.token[..32], self.expires_in);
Some(())
}
async fn refresh(&mut self) -> Option<()> {
// Refresh is actually just a subsequent login with the same headers (without the old token
// or anything). This logic is handled in login, so we just call login again.
let refresh = self.login().await;
info!("Refreshing OAuth token... {}", if refresh.is_some() { "success" } else { "failed" });
refresh
}
}
pub async fn token_daemon() {
// Monitor for refreshing token
loop {
// Get expiry time - be sure to not hold the read lock
let expires_in = { OAUTH_CLIENT.read().await.expires_in };
// sleep for the expiry time minus 2 minutes
let duration = Duration::from_secs(expires_in - 120);
info!("[⏳] Waiting for {duration:?} seconds before refreshing OAuth token...");
tokio::time::sleep(duration).await;
info!("[⌛] {duration:?} Elapsed! Refreshing OAuth token...");
// Refresh token - in its own scope
{
OAUTH_CLIENT.write().await.refresh().await;
}
}
}
#[derive(Debug, Clone, Default)]
struct Device {
oauth_id: String,
initial_headers: HashMap<String, String>,
headers: HashMap<String, String>,
}
impl Device {
fn android() -> Self {
// Generate uuid
let uuid = uuid::Uuid::new_v4().to_string();
// Generate random user-agent
let android_app_version = choose(ANDROID_APP_VERSION_LIST).to_string();
let android_version = fastrand::u8(9..=14);
let android_user_agent = format!("Reddit/{android_app_version}/Android {android_version}");
// Android device headers
let headers = HashMap::from([
("Client-Vendor-Id".into(), uuid.clone()),
("X-Reddit-Device-Id".into(), uuid.clone()),
("User-Agent".into(), android_user_agent),
]);
info!("[🔄] Spoofing Android client with headers: {headers:?}, uuid: \"{uuid}\", and OAuth ID \"{REDDIT_ANDROID_OAUTH_CLIENT_ID}\"");
Self {
oauth_id: REDDIT_ANDROID_OAUTH_CLIENT_ID.to_string(),
headers: headers.clone(),
initial_headers: headers,
}
}
fn new() -> Self {
// See https://github.com/redlib-org/redlib/issues/8
Self::android()
}
}
fn choose<T: Copy>(list: &[T]) -> T {
*fastrand::choose_multiple(list.iter(), 1)[0]
}
#[tokio::test(flavor = "multi_thread")]
async fn test_oauth_client() {
assert!(!OAUTH_CLIENT.read().await.token.is_empty());
}
#[tokio::test(flavor = "multi_thread")]
async fn test_oauth_client_refresh() {
OAUTH_CLIENT.write().await.refresh().await.unwrap();
}
#[tokio::test(flavor = "multi_thread")]
async fn test_oauth_token_exists() {
assert!(!OAUTH_CLIENT.read().await.token.is_empty());
}
#[tokio::test(flavor = "multi_thread")]
async fn test_oauth_headers_len() {
assert!(OAUTH_CLIENT.read().await.headers_map.len() >= 3);
}
#[test]
fn test_creating_device() {
Device::new();
}
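The login request above authenticates with HTTP Basic auth, where the username is the spoofed client ID and the password is empty. A minimal sketch of just that header construction, using the same `base64` API as the code above (requires the `base64` crate):

```rust
use base64::{engine::general_purpose, Engine as _};

// Sketch: build the `Authorization: Basic ...` value used by `Oauth::login`.
// The username is the Android OAuth client ID; the password is left empty.
fn basic_auth_header(oauth_id: &str) -> String {
	let credentials = general_purpose::STANDARD.encode(format!("{oauth_id}:"));
	format!("Basic {credentials}")
}

fn main() {
	println!("{}", basic_auth_header("ohXpoqrZYub1kg"));
}
```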

src/oauth_resources.rs (new file, 235 lines)

@ -0,0 +1,235 @@
// This file was generated by scripts/update_oauth_resources.sh
// Rerun scripts/update_oauth_resources.sh to update this file
// Please do not edit manually
// Filled in with real app versions
pub static _IOS_APP_VERSION_LIST: &[&str; 67] = &[
"Version 2020.0.0/Build 306960",
"Version 2020.10.0/Build 307041",
"Version 2020.10.1/Build 307047",
"Version 2020.1.0/Build 306966",
"Version 2020.11.0/Build 307049",
"Version 2020.11.1/Build 307063",
"Version 2020.12.0/Build 307070",
"Version 2020.13.0/Build 307072",
"Version 2020.13.1/Build 307075",
"Version 2020.14.0/Build 307077",
"Version 2020.14.1/Build 307080",
"Version 2020.15.0/Build 307084",
"Version 2020.16.0/Build 307090",
"Version 2020.17.0/Build 307093",
"Version 2020.19.0/Build 307137",
"Version 2020.20.0/Build 307156",
"Version 2020.20.1/Build 307159",
"Version 2020.2.0/Build 306969",
"Version 2020.21.0/Build 307162",
"Version 2020.21.1/Build 307165",
"Version 2020.22.0/Build 307177",
"Version 2020.22.1/Build 307181",
"Version 2020.23.0/Build 307183",
"Version 2020.24.0/Build 307189",
"Version 2020.25.0/Build 307198",
"Version 2020.26.0/Build 307205",
"Version 2020.26.1/Build 307213",
"Version 2020.27.0/Build 307229",
"Version 2020.28.0/Build 307233",
"Version 2020.29.0/Build 307235",
"Version 2020.30.0/Build 307238",
"Version 2020.3.0/Build 306971",
"Version 2020.31.0/Build 307240",
"Version 2020.31.1/Build 307246",
"Version 2020.32.0/Build 307250",
"Version 2020.33.0/Build 307252",
"Version 2020.34.0/Build 307260",
"Version 2020.35.0/Build 307262",
"Version 2020.36.0/Build 307265",
"Version 2020.37.0/Build 307272",
"Version 2020.38.0/Build 307286",
"Version 2020.39.0/Build 307306",
"Version 2020.4.0/Build 306978",
"Version 2020.5.0/Build 306993",
"Version 2020.5.1/Build 307005",
"Version 2020.6.0/Build 307007",
"Version 2020.7.0/Build 307012",
"Version 2020.8.0/Build 307014",
"Version 2020.8.1/Build 307017",
"Version 2020.9.0/Build 307035",
"Version 2020.9.1/Build 307039",
"Version 2023.18.0/Build 310494",
"Version 2023.19.0/Build 310507",
"Version 2023.20.0/Build 310535",
"Version 2023.21.0/Build 310560",
"Version 2023.22.0/Build 613580",
"Version 2023.23.0/Build 310613",
"Version 2023.23.1/Build 613639",
"Version 2023.24.0/Build 613663",
"Version 2023.25.0/Build 613739",
"Version 2023.26.0/Build 613749",
"Version 2023.27.0/Build 613771",
"Version 2023.28.0/Build 613803",
"Version 2023.28.1/Build 613809",
"Version 2023.29.0/Build 613825",
"Version 2023.30.0/Build 613849",
"Version 2023.31.0/Build 613864",
];
pub static ANDROID_APP_VERSION_LIST: &[&str; 150] = &[
"Version 2023.25.1/Build 1018737",
"Version 2023.26.0/Build 1019073",
"Version 2023.27.0/Build 1031923",
"Version 2023.28.0/Build 1046887",
"Version 2023.29.0/Build 1059855",
"Version 2023.30.0/Build 1078734",
"Version 2023.31.0/Build 1091027",
"Version 2023.32.0/Build 1109919",
"Version 2023.32.1/Build 1114141",
"Version 2023.33.1/Build 1129741",
"Version 2023.34.0/Build 1144243",
"Version 2023.35.0/Build 1157967",
"Version 2023.36.0/Build 1168982",
"Version 2023.37.0/Build 1182743",
"Version 2023.38.0/Build 1198522",
"Version 2023.39.0/Build 1211607",
"Version 2023.39.1/Build 1221505",
"Version 2023.40.0/Build 1221521",
"Version 2023.41.0/Build 1233125",
"Version 2023.41.1/Build 1239615",
"Version 2023.42.0/Build 1245088",
"Version 2023.43.0/Build 1257426",
"Version 2023.44.0/Build 1268622",
"Version 2023.45.0/Build 1281371",
"Version 2023.47.0/Build 1303604",
"Version 2023.48.0/Build 1319123",
"Version 2023.49.0/Build 1321715",
"Version 2023.49.1/Build 1322281",
"Version 2023.50.0/Build 1332338",
"Version 2023.50.1/Build 1345844",
"Version 2023.02.0/Build 717912",
"Version 2023.03.0/Build 729220",
"Version 2023.04.0/Build 744681",
"Version 2023.05.0/Build 755453",
"Version 2023.06.0/Build 775017",
"Version 2023.07.0/Build 788827",
"Version 2023.07.1/Build 790267",
"Version 2023.08.0/Build 798718",
"Version 2023.09.0/Build 812015",
"Version 2023.09.1/Build 816833",
"Version 2023.10.0/Build 821148",
"Version 2023.11.0/Build 830610",
"Version 2023.12.0/Build 841150",
"Version 2023.13.0/Build 852246",
"Version 2023.14.0/Build 861593",
"Version 2023.14.1/Build 864826",
"Version 2023.15.0/Build 870628",
"Version 2023.16.0/Build 883294",
"Version 2023.16.1/Build 886269",
"Version 2023.17.0/Build 896030",
"Version 2023.17.1/Build 900542",
"Version 2023.18.0/Build 911877",
"Version 2023.19.0/Build 927681",
"Version 2023.20.0/Build 943980",
"Version 2023.20.1/Build 946732",
"Version 2023.21.0/Build 956283",
"Version 2023.22.0/Build 968223",
"Version 2023.23.0/Build 983896",
"Version 2023.24.0/Build 998541",
"Version 2023.25.0/Build 1014750",
"Version 2022.24.0/Build 510950",
"Version 2022.24.1/Build 513462",
"Version 2022.25.0/Build 515072",
"Version 2022.25.1/Build 516394",
"Version 2022.25.2/Build 519915",
"Version 2022.26.0/Build 521193",
"Version 2022.27.0/Build 527406",
"Version 2022.27.1/Build 529687",
"Version 2022.28.0/Build 533235",
"Version 2022.30.0/Build 548620",
"Version 2022.31.0/Build 556666",
"Version 2022.31.1/Build 562612",
"Version 2022.32.0/Build 567875",
"Version 2022.33.0/Build 572600",
"Version 2022.34.0/Build 579352",
"Version 2022.35.0/Build 588016",
"Version 2022.35.1/Build 589034",
"Version 2022.36.0/Build 593102",
"Version 2022.37.0/Build 601691",
"Version 2022.38.0/Build 607460",
"Version 2022.39.0/Build 615385",
"Version 2022.39.1/Build 619019",
"Version 2022.40.0/Build 624782",
"Version 2022.41.0/Build 630468",
"Version 2022.41.1/Build 634168",
"Version 2022.42.0/Build 638508",
"Version 2022.43.0/Build 648277",
"Version 2022.44.0/Build 664348",
"Version 2022.45.0/Build 677985",
"Version 2023.01.0/Build 709875",
"Version 2021.45.0/Build 387663",
"Version 2021.46.0/Build 392043",
"Version 2021.47.0/Build 394342",
"Version 2022.10.0/Build 429896",
"Version 2022.1.0/Build 402829",
"Version 2022.11.0/Build 433004",
"Version 2022.12.0/Build 436848",
"Version 2022.13.0/Build 442084",
"Version 2022.13.1/Build 444621",
"Version 2022.14.1/Build 452742",
"Version 2022.15.0/Build 455453",
"Version 2022.16.0/Build 462377",
"Version 2022.17.0/Build 468480",
"Version 2022.18.0/Build 473740",
"Version 2022.19.1/Build 482464",
"Version 2022.20.0/Build 487703",
"Version 2022.2.0/Build 405543",
"Version 2022.21.0/Build 492436",
"Version 2022.22.0/Build 498700",
"Version 2022.23.0/Build 502374",
"Version 2022.23.1/Build 506606",
"Version 2022.3.0/Build 408637",
"Version 2022.4.0/Build 411368",
"Version 2022.5.0/Build 414731",
"Version 2022.6.0/Build 418391",
"Version 2022.6.1/Build 419585",
"Version 2022.6.2/Build 420562",
"Version 2022.7.0/Build 420849",
"Version 2022.8.0/Build 423906",
"Version 2022.9.0/Build 426592",
"Version 2021.17.0/Build 323213",
"Version 2021.18.0/Build 324849",
"Version 2021.19.0/Build 325762",
"Version 2021.20.0/Build 326964",
"Version 2021.21.0/Build 327703",
"Version 2021.21.1/Build 328461",
"Version 2021.22.0/Build 329696",
"Version 2021.23.0/Build 331631",
"Version 2021.24.0/Build 333951",
"Version 2021.25.0/Build 335451",
"Version 2021.26.0/Build 336739",
"Version 2021.27.0/Build 338857",
"Version 2021.28.0/Build 340747",
"Version 2021.29.0/Build 342342",
"Version 2021.30.0/Build 343820",
"Version 2021.31.0/Build 346485",
"Version 2021.32.0/Build 349507",
"Version 2021.33.0/Build 351843",
"Version 2021.34.0/Build 353911",
"Version 2021.35.0/Build 355878",
"Version 2021.36.0/Build 359254",
"Version 2021.36.1/Build 360572",
"Version 2021.37.0/Build 361905",
"Version 2021.38.0/Build 365032",
"Version 2021.39.0/Build 369068",
"Version 2021.39.1/Build 372418",
"Version 2021.41.0/Build 376052",
"Version 2021.42.0/Build 378193",
"Version 2021.43.0/Build 382019",
"Version 2021.44.0/Build 385129",
];
pub static _IOS_OS_VERSION_LIST: &[&str; 8] = &[
"Version 17.0.1 (Build 21A340)",
"Version 17.0.2 (Build 21A350)",
"Version 17.0.3 (Build 21A360)",
"Version 17.1 (Build 21B74)",
"Version 17.1.1 (Build 21B91)",
"Version 17.1.2 (Build 21B101)",
"Version 17.2 (Build 21C62)",
"Version 17.2.1 (Build 21C66)",
];

src/post.rs

@ -1,5 +1,6 @@
// CRATES
use crate::client::json;
use crate::config::get_setting;
use crate::server::RequestExt;
use crate::subreddit::{can_access_quarantine, quarantine};
use crate::utils::{
@ -8,6 +9,8 @@ use crate::utils::{
use hyper::{Body, Request, Response};
use askama::Template;
use once_cell::sync::Lazy;
use regex::Regex;
use std::collections::HashSet;
// STRUCTS
@ -20,13 +23,18 @@ struct PostTemplate {
prefs: Preferences,
single_thread: bool,
url: String,
url_without_query: String,
comment_query: String,
}
static COMMENT_SEARCH_CAPTURE: Lazy<Regex> = Lazy::new(|| Regex::new(r#"\?q=(.*)&type=comment"#).unwrap());
pub async fn item(req: Request<Body>) -> Result<Response<Body>, String> {
// Build Reddit API path
let mut path: String = format!("{}.json?{}&raw_json=1", req.uri().path(), req.uri().query().unwrap_or_default());
let sub = req.param("sub").unwrap_or_default();
let quarantined = can_access_quarantine(&req, &sub);
let url = req.uri().to_string();
// Set sort to sort query parameter
let sort = param(&path, "sort").unwrap_or_else(|| {
@ -56,31 +64,41 @@ pub async fn item(req: Request<Body>) -> Result<Response<Body>, String> {
// Parse the JSON into Post and Comment structs
let post = parse_post(&response[0]["data"]["children"][0]).await;
let req_url = req.uri().to_string();
// Return landing page if Reddit deems this post
// NSFW, but we have also disabled the display of NSFW content
// or if the instance is SFW-only.
if post.nsfw && (setting(&req, "show_nsfw") != "on" || crate::utils::sfw_only()) {
return Ok(nsfw_landing(req).await.unwrap_or_default());
if post.nsfw && crate::utils::should_be_nsfw_gated(&req, &req_url) {
return Ok(nsfw_landing(req, req_url).await.unwrap_or_default());
}
let comments = parse_comments(&response[1], &post.permalink, &post.author.name, highlighted_comment, &get_filters(&req), &req);
let url = req.uri().to_string();
let query = match COMMENT_SEARCH_CAPTURE.captures(&url) {
Some(captures) => captures.get(1).unwrap().as_str().replace("%20", " ").replace('+', " "),
None => String::new(),
};
let comments = match query.as_str() {
"" => parse_comments(&response[1], &post.permalink, &post.author.name, highlighted_comment, &get_filters(&req), &req),
_ => query_comments(&response[1], &post.permalink, &post.author.name, highlighted_comment, &get_filters(&req), &query, &req),
};
// Use the Post and Comment structs to generate a website to show users
template(PostTemplate {
comments,
post,
url_without_query: url.clone().trim_end_matches(&format!("?q={query}&type=comment")).to_string(),
sort,
prefs: Preferences::new(&req),
single_thread,
url,
url: req_url,
comment_query: query,
})
}
// If the Reddit API returns an error, exit and send error page to user
Err(msg) => {
if msg == "quarantined" {
if msg == "quarantined" || msg == "gated" {
let sub = req.param("sub").unwrap_or_default();
quarantine(req, sub)
quarantine(req, sub, msg)
} else {
error(req, msg).await
}
@ -89,6 +107,7 @@ pub async fn item(req: Request<Body>) -> Result<Response<Body>, String> {
}
// COMMENTS
fn parse_comments(json: &serde_json::Value, post_link: &str, post_author: &str, highlighted_comment: &str, filters: &HashSet<String>, req: &Request<Body>) -> Vec<Comment> {
// Parse the comment JSON into a Vector of Comments
let comments = json["data"]["children"].as_array().map_or(Vec::new(), std::borrow::ToOwned::to_owned);
@ -97,88 +116,138 @@ fn parse_comments(json: &serde_json::Value, post_link: &str, post_author: &str,
comments
.into_iter()
.map(|comment| {
let kind = comment["kind"].as_str().unwrap_or_default().to_string();
let data = &comment["data"];
let unix_time = data["created_utc"].as_f64().unwrap_or_default();
let (rel_time, created) = time(unix_time);
let edited = data["edited"].as_f64().map_or((String::new(), String::new()), time);
let score = data["score"].as_i64().unwrap_or(0);
// If this comment contains replies, handle those too
let replies: Vec<Comment> = if data["replies"].is_object() {
parse_comments(&data["replies"], post_link, post_author, highlighted_comment, filters, req)
} else {
Vec::new()
};
let awards: Awards = Awards::parse(&data["all_awardings"]);
let parent_kind_and_id = val(&comment, "parent_id");
let parent_info = parent_kind_and_id.split('_').collect::<Vec<&str>>();
let id = val(&comment, "id");
let highlighted = id == highlighted_comment;
let body = if (val(&comment, "author") == "[deleted]" && val(&comment, "body") == "[removed]") || val(&comment, "body") == "[ Removed by Reddit ]" {
format!(
"<div class=\"md\"><p>[removed] — <a href=\"https://www.unddit.com{}{}\">view removed comment</a></p></div>",
post_link, id
)
} else {
rewrite_urls(&val(&comment, "body_html"))
};
let author = Author {
name: val(&comment, "author"),
flair: Flair {
flair_parts: FlairPart::parse(
data["author_flair_type"].as_str().unwrap_or_default(),
data["author_flair_richtext"].as_array(),
data["author_flair_text"].as_str(),
),
text: val(&comment, "link_flair_text"),
background_color: val(&comment, "author_flair_background_color"),
foreground_color: val(&comment, "author_flair_text_color"),
},
distinguished: val(&comment, "distinguished"),
};
let is_filtered = filters.contains(&["u_", author.name.as_str()].concat());
// Many subreddits have a default comment posted about the sub's rules etc.
// Many libreddit users do not wish to see this kind of comment by default.
// Reddit does not tell us which users are "bots", so a good heuristic is to
// collapse stickied moderator comments.
let is_moderator_comment = data["distinguished"].as_str().unwrap_or_default() == "moderator";
let is_stickied = data["stickied"].as_bool().unwrap_or_default();
let collapsed = (is_moderator_comment && is_stickied) || is_filtered;
Comment {
id,
kind,
parent_id: parent_info[1].to_string(),
parent_kind: parent_info[0].to_string(),
post_link: post_link.to_string(),
post_author: post_author.to_string(),
body,
author,
score: if data["score_hidden"].as_bool().unwrap_or_default() {
("\u{2022}".to_string(), "Hidden".to_string())
} else {
format_num(score)
},
rel_time,
created,
edited,
replies,
highlighted,
awards,
collapsed,
is_filtered,
prefs: Preferences::new(req),
}
build_comment(&comment, data, replies, post_link, post_author, highlighted_comment, filters, req)
})
.collect()
}
fn query_comments(
json: &serde_json::Value,
post_link: &str,
post_author: &str,
highlighted_comment: &str,
filters: &HashSet<String>,
query: &str,
req: &Request<Body>,
) -> Vec<Comment> {
let comments = json["data"]["children"].as_array().map_or(Vec::new(), std::borrow::ToOwned::to_owned);
let mut results = Vec::new();
comments.into_iter().for_each(|comment| {
let data = &comment["data"];
// If this comment contains replies, handle those too
if data["replies"].is_object() {
results.append(&mut query_comments(&data["replies"], post_link, post_author, highlighted_comment, filters, query, req))
}
let c = build_comment(&comment, data, Vec::new(), post_link, post_author, highlighted_comment, filters, req);
if c.body.to_lowercase().contains(&query.to_lowercase()) {
results.push(c);
}
});
results
}
#[allow(clippy::too_many_arguments)]
fn build_comment(
comment: &serde_json::Value,
data: &serde_json::Value,
replies: Vec<Comment>,
post_link: &str,
post_author: &str,
highlighted_comment: &str,
filters: &HashSet<String>,
req: &Request<Body>,
) -> Comment {
let id = val(comment, "id");
let body = if (val(comment, "author") == "[deleted]" && val(comment, "body") == "[removed]") || val(comment, "body") == "[ Removed by Reddit ]" {
format!(
"<div class=\"md\"><p>[removed] — <a href=\"https://{}{}{}\">view removed comment</a></p></div>",
get_setting("REDLIB_PUSHSHIFT_FRONTEND").unwrap_or(String::from(crate::config::DEFAULT_PUSHSHIFT_FRONTEND)),
post_link,
id
)
} else {
rewrite_urls(&val(comment, "body_html"))
};
let kind = comment["kind"].as_str().unwrap_or_default().to_string();
let unix_time = data["created_utc"].as_f64().unwrap_or_default();
let (rel_time, created) = time(unix_time);
let edited = data["edited"].as_f64().map_or((String::new(), String::new()), time);
let score = data["score"].as_i64().unwrap_or(0);
// The JSON API only provides comments up to some threshold.
// Further comments have to be loaded by subsequent requests.
// The "kind" value will be "more" and the "count"
// shows how many more (sub-)comments exist in the respective nesting level.
// Note that in certain (seemingly random) cases, the count is simply wrong.
let more_count = data["count"].as_i64().unwrap_or_default();
let awards: Awards = Awards::parse(&data["all_awardings"]);
let parent_kind_and_id = val(comment, "parent_id");
let parent_info = parent_kind_and_id.split('_').collect::<Vec<&str>>();
let highlighted = id == highlighted_comment;
let author = Author {
name: val(comment, "author"),
flair: Flair {
flair_parts: FlairPart::parse(
data["author_flair_type"].as_str().unwrap_or_default(),
data["author_flair_richtext"].as_array(),
data["author_flair_text"].as_str(),
),
text: val(comment, "link_flair_text"),
background_color: val(comment, "author_flair_background_color"),
foreground_color: val(comment, "author_flair_text_color"),
},
distinguished: val(comment, "distinguished"),
};
let is_filtered = filters.contains(&["u_", author.name.as_str()].concat());
// Many subreddits have a default comment posted about the sub's rules etc.
// Many Redlib users do not wish to see this kind of comment by default.
// Reddit does not tell us which users are "bots", so a good heuristic is to
// collapse stickied moderator comments.
let is_moderator_comment = data["distinguished"].as_str().unwrap_or_default() == "moderator";
let is_stickied = data["stickied"].as_bool().unwrap_or_default();
let collapsed = (is_moderator_comment && is_stickied) || is_filtered;
Comment {
id,
kind,
parent_id: parent_info[1].to_string(),
parent_kind: parent_info[0].to_string(),
post_link: post_link.to_string(),
post_author: post_author.to_string(),
body,
author,
score: if data["score_hidden"].as_bool().unwrap_or_default() {
("\u{2022}".to_string(), "Hidden".to_string())
} else {
format_num(score)
},
rel_time,
created,
edited,
replies,
highlighted,
awards,
collapsed,
is_filtered,
more_count,
prefs: Preferences::new(req),
}
}

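The post page above gains in-thread comment search: the form added in post.html submits `?q=<query>&type=comment`, `COMMENT_SEARCH_CAPTURE` pulls the query out of the request URL, and `query_comments` keeps comments whose body contains it (case-insensitively). A minimal standalone sketch of the capture step, assuming the `regex` and `once_cell` crates already used in the hunk; the helper name is illustrative only:

```rust
use once_cell::sync::Lazy;
use regex::Regex;

// Same pattern as the hunk above: capture everything between "?q=" and "&type=comment".
static COMMENT_SEARCH_CAPTURE: Lazy<Regex> = Lazy::new(|| Regex::new(r#"\?q=(.*)&type=comment"#).unwrap());

// Hypothetical helper: extract the comment-search query from a post URL and
// normalise URL-encoded spaces ("%20") and "+" back to plain spaces.
fn extract_comment_query(url: &str) -> String {
    COMMENT_SEARCH_CAPTURE
        .captures(url)
        .and_then(|caps| caps.get(1))
        .map(|m| m.as_str().replace("%20", " ").replace('+', " "))
        .unwrap_or_default()
}

fn main() {
    assert_eq!(
        extract_comment_query("/r/rust/comments/abc123/example/?q=borrow%20checker&type=comment"),
        "borrow checker"
    );
    // No ?q=...&type=comment pair means no comment search; parse_comments runs as before.
    assert_eq!(extract_comment_query("/r/rust/comments/abc123/example/"), "");
}
```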
View File

@ -145,9 +145,9 @@ pub async fn find(req: Request<Body>) -> Result<Response<Body>, String> {
})
}
Err(msg) => {
if msg == "quarantined" {
if msg == "quarantined" || msg == "gated" {
let sub = req.param("sub").unwrap_or_default();
quarantine(req, sub)
quarantine(req, sub, msg)
} else {
error(req, msg).await
}

View File

@ -46,11 +46,11 @@ impl CompressionType {
/// Returns a `CompressionType` given a content coding
/// in [RFC 7231](https://datatracker.ietf.org/doc/html/rfc7231#section-5.3.4)
/// format.
fn parse(s: &str) -> Option<CompressionType> {
fn parse(s: &str) -> Option<Self> {
let c = match s {
// Compressors we support.
"gzip" => CompressionType::Gzip,
"br" => CompressionType::Brotli,
"gzip" => Self::Gzip,
"br" => Self::Brotli,
// The wildcard means that we can choose whatever
// compression we prefer. In this case, use the
@ -68,8 +68,8 @@ impl CompressionType {
impl ToString for CompressionType {
fn to_string(&self) -> String {
match self {
CompressionType::Gzip => "gzip".to_string(),
CompressionType::Brotli => "br".to_string(),
Self::Gzip => "gzip".to_string(),
Self::Brotli => "br".to_string(),
_ => String::new(),
}
}
@ -137,7 +137,7 @@ impl RequestExt for Request<Body> {
.to_str()
.unwrap_or_default()
.split("; ")
.map(|cookie| Cookie::parse(cookie).unwrap_or_else(|_| Cookie::named("")))
.map(|cookie| Cookie::parse(cookie).unwrap_or_else(|_| Cookie::from("")))
.collect()
})
}
@ -154,7 +154,7 @@ impl ResponseExt for Response<Body> {
.to_str()
.unwrap_or_default()
.split("; ")
.map(|cookie| Cookie::parse(cookie).unwrap_or_else(|_| Cookie::named("")))
.map(|cookie| Cookie::parse(cookie).unwrap_or_else(|_| Cookie::from("")))
.collect()
})
}
@ -166,7 +166,7 @@ impl ResponseExt for Response<Body> {
}
fn remove_cookie(&mut self, name: String) {
let mut cookie = Cookie::named(name);
let mut cookie = Cookie::from(name);
cookie.set_path("/");
cookie.set_max_age(Duration::seconds(1));
if let Ok(val) = header::HeaderValue::from_str(&cookie.to_string()) {
@ -194,7 +194,7 @@ impl Route<'_> {
impl Server {
pub fn new() -> Self {
Server {
Self {
default_headers: HeaderMap::new(),
router: Router::new(),
}
@ -253,7 +253,7 @@ impl Server {
.boxed()
}
// If there was a routing error
Err(e) => async move { new_boilerplate(def_headers, req_headers, 404, e.into()).await }.boxed(),
Err(e) => new_boilerplate(def_headers, req_headers, 404, e.into()).boxed(),
}
}))
}
@ -347,14 +347,6 @@ fn determine_compressor(accept_encoding: String) -> Option<CompressionType> {
impl PartialOrd for CompressorCandidate {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
// Guard against NAN, both on our end and on the other.
if self.q.is_nan() || other.q.is_nan() {
return None;
};
// f64 and CompressionType are ordered, except in the case
// where the f64 is NAN (which we checked against), so we
// can safely return a Some here.
Some(self.cmp(other))
}
}
@ -379,7 +371,7 @@ fn determine_compressor(accept_encoding: String) -> Option<CompressionType> {
// This loop reads the requested compressors and keeps track of whichever
// one has the highest priority per our heuristic.
for val in accept_encoding.to_string().split(',') {
for val in accept_encoding.split(',') {
let mut q: f64 = 1.0;
// The compressor and q-value (if the latter is defined)

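The compression hunks above parse content codings in RFC 7231 form and, per the loop at the end, keep whichever supported coding has the highest q-value (defaulting to 1.0 when none is given). A simplified standalone sketch of that selection — not the crate's `determine_compressor`, and it skips the `*` wildcard and the NaN/tie handling visible above:

```rust
// Pick the preferred supported coding ("gzip" or "br") from an Accept-Encoding
// header by q-value; unsupported codings such as "identity" are ignored.
fn pick_compressor(accept_encoding: &str) -> Option<&'static str> {
    let mut best: Option<(&'static str, f64)> = None;
    for val in accept_encoding.split(',') {
        let mut parts = val.trim().split(";q=");
        let coding = parts.next().unwrap_or_default().trim();
        // A missing q-value means q defaults to 1.0.
        let q: f64 = parts.next().and_then(|q| q.trim().parse().ok()).unwrap_or(1.0);
        let supported = match coding {
            "gzip" => Some("gzip"),
            "br" => Some("br"),
            _ => None,
        };
        if let Some(c) = supported {
            if q > best.map_or(0.0, |(_, best_q)| best_q) {
                best = Some((c, q));
            }
        }
    }
    best.map(|(c, _)| c)
}

fn main() {
    assert_eq!(pick_compressor("gzip;q=0.5, br"), Some("br"));
    assert_eq!(pick_compressor("identity"), None);
}
```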
View File

@ -19,7 +19,7 @@ struct SettingsTemplate {
// CONSTANTS
const PREFS: [&str; 13] = [
const PREFS: [&str; 15] = [
"theme",
"front_page",
"layout",
@ -31,7 +31,9 @@ const PREFS: [&str; 13] = [
"use_hls",
"hide_hls_notification",
"autoplay_videos",
"fixed_navbar",
"hide_awards",
"hide_score",
"disable_visit_reddit_confirmation",
];
@ -76,11 +78,11 @@ pub async fn set(req: Request<Body>) -> Result<Response<Body>, String> {
for &name in &PREFS {
match form.get(name) {
Some(value) => response.insert_cookie(
Cookie::build(name.to_owned(), value.clone())
Cookie::build((name.to_owned(), value.clone()))
.path("/")
.http_only(true)
.expires(OffsetDateTime::now_utc() + Duration::weeks(52))
.finish(),
.into(),
),
None => response.remove_cookie(name.to_string()),
};
@ -115,11 +117,11 @@ fn set_cookies_method(req: Request<Body>, remove_cookies: bool) -> Response<Body
for name in [PREFS.to_vec(), vec!["subscriptions", "filters"]].concat() {
match form.get(name) {
Some(value) => response.insert_cookie(
Cookie::build(name.to_owned(), value.clone())
Cookie::build((name.to_owned(), value.clone()))
.path("/")
.http_only(true)
.expires(OffsetDateTime::now_utc() + Duration::weeks(52))
.finish(),
.into(),
),
None => {
if remove_cookies {

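The settings hunks above move to the tuple form of `Cookie::build` and convert the finished builder with `.into()` rather than the removed `.finish()`. A small sketch of the pattern, assuming a cookie 0.18-style API as the diff suggests, plus the `time` crate for the expiry:

```rust
use cookie::Cookie;
use time::{Duration, OffsetDateTime};

// Hypothetical helper mirroring the preference cookies above: one-year expiry,
// HTTP-only, scoped to "/". The builder is turned into a Cookie with .into().
fn pref_cookie(name: &str, value: &str) -> Cookie<'static> {
    Cookie::build((name.to_owned(), value.to_owned()))
        .path("/")
        .http_only(true)
        .expires(OffsetDateTime::now_utc() + Duration::weeks(52))
        .into()
}

fn main() {
    let cookie = pref_cookie("hide_score", "on");
    println!("{cookie}");
}
```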
View File

@ -6,6 +6,7 @@ use crate::{client::json, server::ResponseExt, RequestExt};
use askama::Template;
use cookie::Cookie;
use hyper::{Body, Request, Response};
use time::{Duration, OffsetDateTime};
// STRUCTS
@ -97,13 +98,19 @@ pub async fn community(req: Request<Body>) -> Result<Response<Body>, String> {
}
};
let req_url = req.uri().to_string();
// Return landing page if this is an NSFW community but the user
// has disabled the display of NSFW content or if the instance is SFW-only.
if sub.nsfw && (setting(&req, "show_nsfw") != "on" || crate::utils::sfw_only()) {
return Ok(nsfw_landing(req).await.unwrap_or_default());
if sub.nsfw && crate::utils::should_be_nsfw_gated(&req, &req_url) {
return Ok(nsfw_landing(req, req_url).await.unwrap_or_default());
}
let path = format!("/r/{}/{}.json?{}&raw_json=1", sub_name.clone(), sort, req.uri().query().unwrap_or_default());
let mut params = String::from("&raw_json=1");
if sub_name == "popular" {
params.push_str("&geo_filter=GLOBAL");
}
let path = format!("/r/{sub_name}/{sort}.json?{}{params}", req.uri().query().unwrap_or_default());
let url = String::from(req.uri().path_and_query().map_or("", |val| val.as_str()));
let redirect_url = url[1..].replace('?', "%3F").replace('&', "%26").replace('+', "%2B");
let filters = get_filters(&req);
@ -144,7 +151,7 @@ pub async fn community(req: Request<Body>) -> Result<Response<Body>, String> {
})
}
Err(msg) => match msg.as_str() {
"quarantined" => quarantine(req, sub_name),
"quarantined" | "gated" => quarantine(req, sub_name, msg),
"private" => error(req, format!("r/{} is a private community", sub_name)).await,
"banned" => error(req, format!("r/{} has been banned from Reddit", sub_name)).await,
_ => error(req, msg).await,
@ -153,9 +160,9 @@ pub async fn community(req: Request<Body>) -> Result<Response<Body>, String> {
}
}
pub fn quarantine(req: Request<Body>, sub: String) -> Result<Response<Body>, String> {
pub fn quarantine(req: Request<Body>, sub: String, restriction: String) -> Result<Response<Body>, String> {
let wall = WallTemplate {
title: format!("r/{} is quarantined", sub),
title: format!("r/{} is {}", sub, restriction),
msg: "Please click the button below to continue to this subreddit.".to_string(),
url: req.uri().to_string(),
sub,
@ -176,11 +183,11 @@ pub async fn add_quarantine_exception(req: Request<Body>) -> Result<Response<Bod
let redir = param(&format!("?{}", req.uri().query().unwrap_or_default()), "redir").ok_or("Invalid URL")?;
let mut response = redirect(redir);
response.insert_cookie(
Cookie::build(&format!("allow_quaran_{}", subreddit.to_lowercase()), "true")
Cookie::build((&format!("allow_quaran_{}", subreddit.to_lowercase()), "true"))
.path("/")
.http_only(true)
.expires(cookie::Expiration::Session)
.finish(),
.into(),
);
Ok(response)
}
@ -211,19 +218,23 @@ pub async fn subscriptions_filters(req: Request<Body>) -> Result<Response<Body>,
let mut filters = preferences.filters;
// Retrieve list of posts for these subreddits to extract display names
let posts = json(format!("/r/{}/hot.json?raw_json=1", sub), true).await?;
let display_lookup: Vec<(String, &str)> = posts["data"]["children"]
.as_array()
.map(|list| {
list
.iter()
.map(|post| {
let display_name = post["data"]["subreddit"].as_str().unwrap_or_default();
(display_name.to_lowercase(), display_name)
})
.collect::<Vec<_>>()
})
.unwrap_or_default();
let posts = json(format!("/r/{}/hot.json?raw_json=1", sub), true).await;
let display_lookup: Vec<(String, &str)> = match &posts {
Ok(posts) => posts["data"]["children"]
.as_array()
.map(|list| {
list
.iter()
.map(|post| {
let display_name = post["data"]["subreddit"].as_str().unwrap_or_default();
(display_name.to_lowercase(), display_name)
})
.collect::<Vec<_>>()
})
.unwrap_or_default(),
Err(_) => vec![],
};
// Find each subreddit name (separated by '+') in sub parameter
for part in sub.split('+').filter(|x| x != &"") {
@ -237,8 +248,12 @@ pub async fn subscriptions_filters(req: Request<Body>) -> Result<Response<Body>,
} else {
// This subreddit display name isn't known, retrieve it
let path: String = format!("/r/{}/about.json?raw_json=1", part);
display = json(path, true).await?;
display["data"]["display_name"].as_str().ok_or_else(|| "Failed to query subreddit name".to_string())?
display = json(path, true).await;
match &display {
Ok(display) => display["data"]["display_name"].as_str(),
Err(_) => None,
}
.unwrap_or(part)
};
// Modify sub list based on action
@ -280,22 +295,22 @@ pub async fn subscriptions_filters(req: Request<Body>) -> Result<Response<Body>,
response.remove_cookie("subscriptions".to_string());
} else {
response.insert_cookie(
Cookie::build("subscriptions", sub_list.join("+"))
Cookie::build(("subscriptions", sub_list.join("+")))
.path("/")
.http_only(true)
.expires(OffsetDateTime::now_utc() + Duration::weeks(52))
.finish(),
.into(),
);
}
if filters.is_empty() {
response.remove_cookie("filters".to_string());
} else {
response.insert_cookie(
Cookie::build("filters", filters.join("+"))
Cookie::build(("filters", filters.join("+")))
.path("/")
.http_only(true)
.expires(OffsetDateTime::now_utc() + Duration::weeks(52))
.finish(),
.into(),
);
}
@ -323,8 +338,8 @@ pub async fn wiki(req: Request<Body>) -> Result<Response<Body>, String> {
url,
}),
Err(msg) => {
if msg == "quarantined" {
quarantine(req, sub)
if msg == "quarantined" || msg == "gated" {
quarantine(req, sub, msg)
} else {
error(req, msg).await
}
@ -361,8 +376,8 @@ pub async fn sidebar(req: Request<Body>) -> Result<Response<Body>, String> {
url,
}),
Err(msg) => {
if msg == "quarantined" {
quarantine(req, sub)
if msg == "quarantined" || msg == "gated" {
quarantine(req, sub, msg)
} else {
error(req, msg).await
}
@ -433,3 +448,9 @@ async fn subreddit(sub: &str, quarantined: bool) -> Result<Subreddit, String> {
nsfw: res["data"]["over18"].as_bool().unwrap_or_default(),
})
}
#[tokio::test(flavor = "multi_thread")]
async fn test_fetching_subreddit() {
let subreddit = subreddit("rust", false).await;
assert!(subreddit.is_ok());
}

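One behavioural change in the subreddit hunk above: requests for `/r/popular` get an extra `geo_filter=GLOBAL` parameter so the listing is not localized to the instance's region. A tiny sketch of the resulting path construction (the function name is illustrative; the real code builds the path inline):

```rust
// Build the JSON listing path; only r/popular gets the global geo filter.
fn listing_path(sub_name: &str, sort: &str, query: &str) -> String {
    let mut params = String::from("&raw_json=1");
    if sub_name == "popular" {
        params.push_str("&geo_filter=GLOBAL");
    }
    format!("/r/{sub_name}/{sort}.json?{query}{params}")
}

fn main() {
    assert_eq!(
        listing_path("popular", "hot", ""),
        "/r/popular/hot.json?&raw_json=1&geo_filter=GLOBAL"
    );
    assert_eq!(listing_path("rust", "new", "after=t3_abc"), "/r/rust/new.json?after=t3_abc&raw_json=1");
}
```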
View File

@ -43,18 +43,19 @@ pub async fn profile(req: Request<Body>) -> Result<Response<Body>, String> {
let url = String::from(req.uri().path_and_query().map_or("", |val| val.as_str()));
let redirect_url = url[1..].replace('?', "%3F").replace('&', "%26");
// Retrieve other variables from Libreddit request
// Retrieve other variables from Redlib request
let sort = param(&path, "sort").unwrap_or_default();
let username = req.param("name").unwrap_or_default();
// Retrieve info from user about page.
let user = user(&username).await.unwrap_or_default();
let req_url = req.uri().to_string();
// Return landing page if Reddit deems this user NSFW,
// but we have also disabled the display of NSFW content or if the instance
// is SFW-only.
if user.nsfw && (setting(&req, "show_nsfw") != "on" || crate::utils::sfw_only()) {
return Ok(nsfw_landing(req).await.unwrap_or_default());
if user.nsfw && crate::utils::should_be_nsfw_gated(&req, &req_url) {
return Ok(nsfw_landing(req, req_url).await.unwrap_or_default());
}
let filters = get_filters(&req);
@ -128,3 +129,10 @@ async fn user(name: &str) -> Result<User, String> {
}
})
}
#[tokio::test(flavor = "multi_thread")]
async fn test_fetching_user() {
let user = user("spez").await;
assert!(user.is_ok());
assert!(user.unwrap().karma > 100);
}

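Both the user page above and the subreddit page earlier now delegate the check to `should_be_nsfw_gated`, defined in the utils.rs hunk further down. A minimal truth-table sketch of that decision, with plain booleans standing in for the setting and URL lookups:

```rust
// Gate NSFW content unless the viewer enabled "show_nsfw"; the
// &bypass_nsfw_landing flag is only honoured on non-SFW-only instances.
fn should_gate(show_nsfw_on: bool, sfw_only_instance: bool, url_has_bypass: bool) -> bool {
    let gate_nsfw = !show_nsfw_on || sfw_only_instance;
    let bypass_gate = !sfw_only_instance && url_has_bypass;
    gate_nsfw && !bypass_gate
}

fn main() {
    assert!(should_gate(false, false, false)); // NSFW display disabled: gated
    assert!(!should_gate(false, false, true)); // bypass flag honoured
    assert!(should_gate(false, true, true));   // SFW-only instance: bypass ignored
    assert!(!should_gate(true, false, false)); // viewer opted in: not gated
}
```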
View File

@ -1,3 +1,4 @@
use crate::config::get_setting;
//
// CRATES
//
@ -5,6 +6,7 @@ use crate::{client::json, server::RequestExt};
use askama::Template;
use cookie::Cookie;
use hyper::{Body, Request, Response};
use once_cell::sync::Lazy;
use regex::Regex;
use rust_embed::RustEmbed;
use serde_json::Value;
@ -96,6 +98,61 @@ pub struct Author {
pub distinguished: String,
}
pub struct Poll {
pub poll_options: Vec<PollOption>,
pub voting_end_timestamp: (String, String),
pub total_vote_count: u64,
}
impl Poll {
pub fn parse(poll_data: &Value) -> Option<Self> {
poll_data.as_object()?;
let total_vote_count = poll_data["total_vote_count"].as_u64()?;
// voting_end_timestamp is in the format of milliseconds
let voting_end_timestamp = time(poll_data["voting_end_timestamp"].as_f64()? / 1000.0);
let poll_options = PollOption::parse(&poll_data["options"])?;
Some(Self {
poll_options,
total_vote_count,
voting_end_timestamp,
})
}
pub fn most_votes(&self) -> u64 {
self.poll_options.iter().filter_map(|o| o.vote_count).max().unwrap_or(0)
}
}
pub struct PollOption {
pub id: u64,
pub text: String,
pub vote_count: Option<u64>,
}
impl PollOption {
pub fn parse(options: &Value) -> Option<Vec<Self>> {
Some(
options
.as_array()?
.iter()
.filter_map(|option| {
// For each poll option:
// the "id" field is serialized as a JSON string, so as_u64() would return None; parse the string instead
let id = option["id"].as_str()?.parse::<u64>().ok()?;
let text = option["text"].as_str()?.to_owned();
let vote_count = option["vote_count"].as_u64();
// Construct PollOption items
Some(Self { id, text, vote_count })
})
.collect::<Vec<Self>>(),
)
}
}
// Post flags with nsfw and stickied
pub struct Flags {
pub nsfw: bool,
@ -163,6 +220,9 @@ impl Media {
gallery = GalleryMedia::parse(&data["gallery_data"]["items"], &data["media_metadata"]);
("gallery", &data["url"], None)
} else if data["is_reddit_media_domain"].as_bool().unwrap_or_default() && data["domain"] == "i.redd.it" {
// If this post contains a reddit media (image) URL.
("image", &data["url"], None)
} else {
// If type can't be determined, return url
("link", &data["url"], None)
@ -177,6 +237,8 @@ impl Media {
Self {
url: format_url(url_val.as_str().unwrap_or_default()),
alt_url,
// Note: in the data["is_reddit_media_domain"] path above
// width and height will be 0.
width: source["width"].as_i64().unwrap_or_default(),
height: source["height"].as_i64().unwrap_or_default(),
poster: format_url(source["url"].as_str().unwrap_or_default()),
@ -204,10 +266,17 @@ impl GalleryMedia {
// For each image in gallery
let media_id = item["media_id"].as_str().unwrap_or_default();
let image = &metadata[media_id]["s"];
let image_type = &metadata[media_id]["m"];
let url = if image_type == "image/gif" {
image["gif"].as_str().unwrap_or_default()
} else {
image["u"].as_str().unwrap_or_default()
};
// Construct gallery items
Self {
url: format_url(image["u"].as_str().unwrap_or_default()),
url: format_url(url),
width: image["x"].as_i64().unwrap_or_default(),
height: image["y"].as_i64().unwrap_or_default(),
caption: item["caption"].as_str().unwrap_or_default().to_string(),
@ -226,6 +295,7 @@ pub struct Post {
pub body: String,
pub author: Author,
pub permalink: String,
pub poll: Option<Poll>,
pub score: (String, String),
pub upvote_ratio: i64,
pub post_type: String,
@ -241,6 +311,7 @@ pub struct Post {
pub gallery: Vec<GalleryMedia>,
pub awards: Awards,
pub nsfw: bool,
pub ws_url: String,
}
impl Post {
@ -335,6 +406,7 @@ impl Post {
stickied: data["stickied"].as_bool().unwrap_or_default() || data["pinned"].as_bool().unwrap_or_default(),
},
permalink: val(post, "permalink"),
poll: Poll::parse(&data["poll_data"]),
rel_time,
created,
num_duplicates: post["data"]["num_duplicates"].as_u64().unwrap_or(0),
@ -342,6 +414,7 @@ impl Post {
gallery,
awards,
nsfw: post["data"]["over_18"].as_bool().unwrap_or_default(),
ws_url: val(post, "websocket_url"),
});
}
@ -370,6 +443,7 @@ pub struct Comment {
pub awards: Awards,
pub collapsed: bool,
pub is_filtered: bool,
pub more_count: i64,
pub prefs: Preferences,
}
@ -505,12 +579,14 @@ pub struct Preferences {
pub hide_hls_notification: String,
pub use_hls: String,
pub autoplay_videos: String,
pub fixed_navbar: String,
pub disable_visit_reddit_confirmation: String,
pub comment_sort: String,
pub post_sort: String,
pub subscriptions: Vec<String>,
pub filters: Vec<String>,
pub hide_awards: String,
pub hide_score: String,
}
#[derive(RustEmbed)]
@ -530,21 +606,23 @@ impl Preferences {
}
Self {
available_themes: themes,
theme: setting(&req, "theme"),
front_page: setting(&req, "front_page"),
layout: setting(&req, "layout"),
wide: setting(&req, "wide"),
show_nsfw: setting(&req, "show_nsfw"),
blur_nsfw: setting(&req, "blur_nsfw"),
use_hls: setting(&req, "use_hls"),
hide_hls_notification: setting(&req, "hide_hls_notification"),
autoplay_videos: setting(&req, "autoplay_videos"),
disable_visit_reddit_confirmation: setting(&req, "disable_visit_reddit_confirmation"),
comment_sort: setting(&req, "comment_sort"),
post_sort: setting(&req, "post_sort"),
subscriptions: setting(&req, "subscriptions").split('+').map(String::from).filter(|s| !s.is_empty()).collect(),
filters: setting(&req, "filters").split('+').map(String::from).filter(|s| !s.is_empty()).collect(),
hide_awards: setting(&req, "hide_awards"),
theme: setting(req, "theme"),
front_page: setting(req, "front_page"),
layout: setting(req, "layout"),
wide: setting(req, "wide"),
show_nsfw: setting(req, "show_nsfw"),
blur_nsfw: setting(req, "blur_nsfw"),
use_hls: setting(req, "use_hls"),
hide_hls_notification: setting(req, "hide_hls_notification"),
autoplay_videos: setting(req, "autoplay_videos"),
fixed_navbar: setting_or_default(req, "fixed_navbar", "on".to_string()),
disable_visit_reddit_confirmation: setting(req, "disable_visit_reddit_confirmation"),
comment_sort: setting(req, "comment_sort"),
post_sort: setting(req, "post_sort"),
subscriptions: setting(req, "subscriptions").split('+').map(String::from).filter(|s| !s.is_empty()).collect(),
filters: setting(req, "filters").split('+').map(String::from).filter(|s| !s.is_empty()).collect(),
hide_awards: setting(req, "hide_awards"),
hide_score: setting(req, "hide_score"),
}
}
}
@ -592,9 +670,12 @@ pub async fn parse_post(post: &serde_json::Value) -> Post {
let permalink = val(post, "permalink");
let poll = Poll::parse(&post["data"]["poll_data"]);
let body = if val(post, "removed_by_category") == "moderator" {
format!(
"<div class=\"md\"><p>[removed] — <a href=\"https://www.unddit.com{}\">view removed post</a></p></div>",
"<div class=\"md\"><p>[removed] — <a href=\"https://{}{}\">view removed post</a></p></div>",
get_setting("REDLIB_PUSHSHIFT_FRONTEND").unwrap_or(String::from(crate::config::DEFAULT_PUSHSHIFT_FRONTEND)),
permalink
)
} else {
@ -622,6 +703,7 @@ pub async fn parse_post(post: &serde_json::Value) -> Post {
distinguished: val(post, "distinguished"),
},
permalink,
poll,
score: format_num(score),
upvote_ratio: ratio as i64,
post_type,
@ -659,6 +741,7 @@ pub async fn parse_post(post: &serde_json::Value) -> Post {
gallery,
awards,
nsfw: post["data"]["over_18"].as_bool().unwrap_or_default(),
ws_url: val(post, "websocket_url"),
}
}
@ -686,16 +769,26 @@ pub fn setting(req: &Request<Body>, name: &str) -> String {
.cookie(name)
.unwrap_or_else(|| {
// If there is no cookie for this setting, try receiving a default from the config
if let Some(default) = crate::config::get_setting(&format!("LIBREDDIT_DEFAULT_{}", name.to_uppercase())) {
if let Some(default) = crate::config::get_setting(&format!("REDLIB_DEFAULT_{}", name.to_uppercase())) {
Cookie::new(name, default)
} else {
Cookie::named(name)
Cookie::from(name)
}
})
.value()
.to_string()
}
// Retrieve the value of a setting by name or the default value
pub fn setting_or_default(req: &Request<Body>, name: &str, default: String) -> String {
let value = setting(req, name);
if !value.is_empty() {
value
} else {
default
}
}
// Detect and redirect in the event of a random subreddit
pub async fn catch_random(sub: &str, additional: &str) -> Result<Response<Body>, String> {
if sub == "random" || sub == "randnsfw" {
@ -709,6 +802,21 @@ pub async fn catch_random(sub: &str, additional: &str) -> Result<Response<Body>,
}
}
static REGEX_URL_WWW: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://www\.reddit\.com/(.*)").unwrap());
static REGEX_URL_OLD: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://old\.reddit\.com/(.*)").unwrap());
static REGEX_URL_NP: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://np\.reddit\.com/(.*)").unwrap());
static REGEX_URL_PLAIN: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://reddit\.com/(.*)").unwrap());
static REGEX_URL_VIDEOS: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://v\.redd\.it/(.*)/DASH_([0-9]{2,4}(\.mp4|$|\?source=fallback))").unwrap());
static REGEX_URL_VIDEOS_HLS: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://v\.redd\.it/(.+)/(HLSPlaylist\.m3u8.*)$").unwrap());
static REGEX_URL_IMAGES: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://i\.redd\.it/(.*)").unwrap());
static REGEX_URL_THUMBS_A: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://a\.thumbs\.redditmedia\.com/(.*)").unwrap());
static REGEX_URL_THUMBS_B: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://b\.thumbs\.redditmedia\.com/(.*)").unwrap());
static REGEX_URL_EMOJI: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://emoji\.redditmedia\.com/(.*)/(.*)").unwrap());
static REGEX_URL_PREVIEW: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://preview\.redd\.it/(.*)").unwrap());
static REGEX_URL_EXTERNAL_PREVIEW: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://external\-preview\.redd\.it/(.*)").unwrap());
static REGEX_URL_STYLES: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://styles\.redditmedia\.com/(.*)").unwrap());
static REGEX_URL_STATIC_MEDIA: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://www\.redditstatic\.com/(.*)").unwrap());
// Direct urls to proxy if proxy is enabled
pub fn format_url(url: &str) -> String {
if url.is_empty() || url == "self" || url == "default" || url == "nsfw" || url == "spoiler" {
@ -717,13 +825,11 @@ pub fn format_url(url: &str) -> String {
Url::parse(url).map_or(url.to_string(), |parsed| {
let domain = parsed.domain().unwrap_or_default();
let capture = |regex: &str, format: &str, segments: i16| {
Regex::new(regex).map_or(String::new(), |re| {
re.captures(url).map_or(String::new(), |caps| match segments {
1 => [format, &caps[1]].join(""),
2 => [format, &caps[1], "/", &caps[2]].join(""),
_ => String::new(),
})
let capture = |regex: &Regex, format: &str, segments: i16| {
regex.captures(url).map_or(String::new(), |caps| match segments {
1 => [format, &caps[1]].join(""),
2 => [format, &caps[1], "/", &caps[2]].join(""),
_ => String::new(),
})
};
@ -749,44 +855,51 @@ pub fn format_url(url: &str) -> String {
}
match domain {
"www.reddit.com" => capture(r"https://www\.reddit\.com/(.*)", "/", 1),
"old.reddit.com" => capture(r"https://old\.reddit\.com/(.*)", "/", 1),
"np.reddit.com" => capture(r"https://np\.reddit\.com/(.*)", "/", 1),
"reddit.com" => capture(r"https://reddit\.com/(.*)", "/", 1),
"v.redd.it" => chain!(
capture(r"https://v\.redd\.it/(.*)/DASH_([0-9]{2,4}(\.mp4|$|\?source=fallback))", "/vid/", 2),
capture(r"https://v\.redd\.it/(.+)/(HLSPlaylist\.m3u8.*)$", "/hls/", 2)
),
"i.redd.it" => capture(r"https://i\.redd\.it/(.*)", "/img/", 1),
"a.thumbs.redditmedia.com" => capture(r"https://a\.thumbs\.redditmedia\.com/(.*)", "/thumb/a/", 1),
"b.thumbs.redditmedia.com" => capture(r"https://b\.thumbs\.redditmedia\.com/(.*)", "/thumb/b/", 1),
"emoji.redditmedia.com" => capture(r"https://emoji\.redditmedia\.com/(.*)/(.*)", "/emoji/", 2),
"preview.redd.it" => capture(r"https://preview\.redd\.it/(.*)", "/preview/pre/", 1),
"external-preview.redd.it" => capture(r"https://external\-preview\.redd\.it/(.*)", "/preview/external-pre/", 1),
"styles.redditmedia.com" => capture(r"https://styles\.redditmedia\.com/(.*)", "/style/", 1),
"www.redditstatic.com" => capture(r"https://www\.redditstatic\.com/(.*)", "/static/", 1),
"www.reddit.com" => capture(&REGEX_URL_WWW, "/", 1),
"old.reddit.com" => capture(&REGEX_URL_OLD, "/", 1),
"np.reddit.com" => capture(&REGEX_URL_NP, "/", 1),
"reddit.com" => capture(&REGEX_URL_PLAIN, "/", 1),
"v.redd.it" => chain!(capture(&REGEX_URL_VIDEOS, "/vid/", 2), capture(&REGEX_URL_VIDEOS_HLS, "/hls/", 2)),
"i.redd.it" => capture(&REGEX_URL_IMAGES, "/img/", 1),
"a.thumbs.redditmedia.com" => capture(&REGEX_URL_THUMBS_A, "/thumb/a/", 1),
"b.thumbs.redditmedia.com" => capture(&REGEX_URL_THUMBS_B, "/thumb/b/", 1),
"emoji.redditmedia.com" => capture(&REGEX_URL_EMOJI, "/emoji/", 2),
"preview.redd.it" => capture(&REGEX_URL_PREVIEW, "/preview/pre/", 1),
"external-preview.redd.it" => capture(&REGEX_URL_EXTERNAL_PREVIEW, "/preview/external-pre/", 1),
"styles.redditmedia.com" => capture(&REGEX_URL_STYLES, "/style/", 1),
"www.redditstatic.com" => capture(&REGEX_URL_STATIC_MEDIA, "/static/", 1),
_ => url.to_string(),
}
})
}
}
// Rewrite Reddit links to Libreddit in body of text
// These are links we want to replace in-body
static REDDIT_REGEX: Lazy<Regex> = Lazy::new(|| Regex::new(r#"href="(https|http|)://(www\.|old\.|np\.|amp\.|new\.|)(reddit\.com|redd\.it)/"#).unwrap());
static REDDIT_PREVIEW_REGEX: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://(external-preview|preview)\.redd\.it(.*)[^?]").unwrap());
static REDDIT_EMOJI_REGEX: Lazy<Regex> = Lazy::new(|| Regex::new(r"https?://(www|).redditstatic\.com/(.*)").unwrap());
// Rewrite Reddit links to Redlib in body of text
pub fn rewrite_urls(input_text: &str) -> String {
let text1 = Regex::new(r#"href="(https|http|)://(www\.|old\.|np\.|amp\.|)(reddit\.com|redd\.it)/"#)
.map_or(String::new(), |re| re.replace_all(input_text, r#"href="/"#).to_string())
let text1 =
// Rewrite Reddit links to Redlib
REDDIT_REGEX.replace_all(input_text, r#"href="/"#)
.to_string();
let text1 = REDDIT_EMOJI_REGEX
.replace_all(&text1, format_url(REDDIT_EMOJI_REGEX.find(&text1).map(|x| x.as_str()).unwrap_or_default()))
.to_string()
// Remove (html-encoded) "\" from URLs.
.replace("%5C", "")
.replace('\\', "");
.replace("\\_", "_");
// Rewrite external media previews to Libreddit
Regex::new(r"https://external-preview\.redd\.it(.*)[^?]").map_or(String::new(), |re| {
if re.is_match(&text1) {
re.replace_all(&text1, format_url(re.find(&text1).map(|x| x.as_str()).unwrap_or_default())).to_string()
} else {
text1
}
})
// Rewrite external media previews to Redlib
if REDDIT_PREVIEW_REGEX.is_match(&text1) {
REDDIT_PREVIEW_REGEX
.replace_all(&text1, format_url(REDDIT_PREVIEW_REGEX.find(&text1).map(|x| x.as_str()).unwrap_or_default()))
.to_string()
} else {
text1
}
}
// Format vote count to a string that will be displayed.
@ -807,20 +920,31 @@ pub fn format_num(num: i64) -> (String, String) {
// Parse a relative and absolute time from a UNIX timestamp
pub fn time(created: f64) -> (String, String) {
let time = OffsetDateTime::from_unix_timestamp(created.round() as i64).unwrap_or(OffsetDateTime::UNIX_EPOCH);
let time_delta = OffsetDateTime::now_utc() - time;
let now = OffsetDateTime::now_utc();
let min = time.min(now);
let max = time.max(now);
let time_delta = max - min;
// If the time difference is more than a month, show full date
let rel_time = if time_delta > Duration::days(30) {
let mut rel_time = if time_delta > Duration::days(30) {
time.format(format_description!("[month repr:short] [day] '[year repr:last_two]")).unwrap_or_default()
// Otherwise, show relative date/time
} else if time_delta.whole_days() > 0 {
format!("{}d ago", time_delta.whole_days())
format!("{}d", time_delta.whole_days())
} else if time_delta.whole_hours() > 0 {
format!("{}h ago", time_delta.whole_hours())
format!("{}h", time_delta.whole_hours())
} else {
format!("{}m ago", time_delta.whole_minutes())
format!("{}m", time_delta.whole_minutes())
};
if time_delta <= Duration::days(30) {
if now < time {
rel_time += " left";
} else {
rel_time += " ago";
}
}
(
rel_time,
time
@ -871,7 +995,7 @@ pub async fn error(req: Request<Body>, msg: impl ToString) -> Result<Response<Bo
Ok(Response::builder().status(404).header("content-type", "text/html").body(body.into()).unwrap_or_default())
}
/// Returns true if the config/env variable `LIBREDDIT_SFW_ONLY` carries the
/// Returns true if the config/env variable `REDLIB_SFW_ONLY` carries the
/// value `on`.
///
/// If this variable is set as such, the instance will operate in SFW-only
@ -879,17 +1003,27 @@ pub async fn error(req: Request<Body>, msg: impl ToString) -> Result<Response<Bo
/// subreddits or posts or userpages for users Reddit has deemed NSFW will
/// be denied.
pub fn sfw_only() -> bool {
match crate::config::get_setting("LIBREDDIT_SFW_ONLY") {
match crate::config::get_setting("REDLIB_SFW_ONLY") {
Some(val) => val == "on",
None => false,
}
}
// Determines if a request should redirect to an NSFW landing gate.
pub fn should_be_nsfw_gated(req: &Request<Body>, req_url: &str) -> bool {
let sfw_instance = sfw_only();
let gate_nsfw = (setting(req, "show_nsfw") != "on") || sfw_instance;
// The NSFW landing gate should not be bypassed on an SFW-only instance.
let bypass_gate = !sfw_instance && req_url.contains("&bypass_nsfw_landing");
gate_nsfw && !bypass_gate
}
/// Renders the landing page for NSFW content when the user has not enabled
/// "show NSFW posts" in settings.
pub async fn nsfw_landing(req: Request<Body>) -> Result<Response<Body>, String> {
pub async fn nsfw_landing(req: Request<Body>, req_url: String) -> Result<Response<Body>, String> {
let res_type: ResourceType;
let url = req.uri().to_string();
// Determine from the request URL if the resource is a subreddit, a user
// page, or a post.
@ -908,7 +1042,7 @@ pub async fn nsfw_landing(req: Request<Body>) -> Result<Response<Body>, String>
res,
res_type,
prefs: Preferences::new(&req),
url,
url: req_url,
}
.render()
.unwrap_or_default();
@ -930,13 +1064,27 @@ mod tests {
}
#[test]
fn rewrite_urls_removes_backslashes() {
let comment_body_html =
r#"<a href=\"https://www.reddit.com/r/linux%5C_gaming/comments/x/just%5C_a%5C_test%5C/\">https://www.reddit.com/r/linux\\_gaming/comments/x/just\\_a\\_test/</a>"#;
fn rewrite_urls_removes_backslashes_and_rewrites_url() {
assert_eq!(
rewrite_urls(comment_body_html),
r#"<a href="https://www.reddit.com/r/linux_gaming/comments/x/just_a_test/">https://www.reddit.com/r/linux_gaming/comments/x/just_a_test/</a>"#
)
rewrite_urls(
"<a href=\"https://new.reddit.com/r/linux%5C_gaming/comments/x/just%5C_a%5C_test%5C/\">https://new.reddit.com/r/linux\\_gaming/comments/x/just\\_a\\_test/</a>"
),
"<a href=\"/r/linux_gaming/comments/x/just_a_test/\">https://new.reddit.com/r/linux_gaming/comments/x/just_a_test/</a>"
);
assert_eq!(
rewrite_urls(
"e.g. &lt;a href=\"https://www.reddit.com/r/linux%5C_gaming/comments/ql9j15/anyone%5C_else%5C_confused%5C_with%5C_linus%5C_linux%5C_issues/\"&gt;https://www.reddit.com/r/linux\\_gaming/comments/ql9j15/anyone\\_else\\_confused\\_with\\_linus\\_linux\\_issues/&lt;/a&gt;"
),
"e.g. &lt;a href=\"/r/linux_gaming/comments/ql9j15/anyone_else_confused_with_linus_linux_issues/\"&gt;https://www.reddit.com/r/linux_gaming/comments/ql9j15/anyone_else_confused_with_linus_linux_issues/&lt;/a&gt;"
);
}
#[test]
fn rewrite_urls_keeps_intentional_backslashes() {
assert_eq!(
rewrite_urls("printf \"\\npolkit.addRule(function(action, subject)"),
"printf \"\\npolkit.addRule(function(action, subject)"
);
}
#[test]
@ -960,6 +1108,10 @@ mod tests {
"/hls/foo/HLSPlaylist.m3u8?a=bar&v=1&f=sd"
);
assert_eq!(format_url("https://www.redditstatic.com/gold/awards/icon/icon.png"), "/static/gold/awards/icon/icon.png");
assert_eq!(
format_url("https://www.redditstatic.com/marketplace-assets/v1/core/emotes/snoomoji_emotes/free_emotes_pack/shrug.gif"),
"/static/marketplace-assets/v1/core/emotes/snoomoji_emotes/free_emotes_pack/shrug.gif"
);
assert_eq!(format_url(""), "");
assert_eq!(format_url("self"), "");
@ -968,3 +1120,33 @@ mod tests {
assert_eq!(format_url("spoiler"), "");
}
}
#[test]
fn test_rewriting_emoji() {
let input = r#"<div class="md"><p>How can you have such hard feelings towards a license? <img src="https://www.redditstatic.com/marketplace-assets/v1/core/emotes/snoomoji_emotes/free_emotes_pack/shrug.gif" width="20" height="20" style="vertical-align:middle"> Let people use what license they want, and BSD is one of the least restrictive ones AFAIK.</p>"#;
let output = r#"<div class="md"><p>How can you have such hard feelings towards a license? <img src="/static/marketplace-assets/v1/core/emotes/snoomoji_emotes/free_emotes_pack/shrug.gif" width="20" height="20" style="vertical-align:middle"> Let people use what license they want, and BSD is one of the least restrictive ones AFAIK.</p>"#;
assert_eq!(rewrite_urls(input), output);
}
#[tokio::test(flavor = "multi_thread")]
async fn test_fetching_subreddit_quarantined() {
let subreddit = Post::fetch("/r/drugs", true).await;
assert!(subreddit.is_ok());
assert!(!subreddit.unwrap().0.is_empty());
}
#[tokio::test(flavor = "multi_thread")]
async fn test_fetching_nsfw_subreddit() {
let subreddit = Post::fetch("/r/randnsfw", false).await;
assert!(subreddit.is_ok());
assert!(!subreddit.unwrap().0.is_empty());
}
#[tokio::test(flavor = "multi_thread")]
async fn test_fetching_ws() {
let subreddit = Post::fetch("/r/popular", false).await;
assert!(subreddit.is_ok());
for post in subreddit.unwrap().0 {
assert!(post.ws_url.starts_with("wss://k8s-lb.wss.redditmedia.com/link/"));
}
}

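Among the utils.rs changes above, `time()` now takes the absolute difference between the timestamp and the current time, so future timestamps (such as a poll's `voting_end_timestamp`) read as "… left" while past ones read as "… ago". A condensed sketch of that idea, omitting the full-date branch used for deltas over 30 days:

```rust
use time::{Duration, OffsetDateTime};

// Relative time with direction: units truncate (49 hours ahead prints "2d left").
fn rel_time(created: OffsetDateTime) -> String {
    let now = OffsetDateTime::now_utc();
    let delta = created.max(now) - created.min(now);
    let base = if delta.whole_days() > 0 {
        format!("{}d", delta.whole_days())
    } else if delta.whole_hours() > 0 {
        format!("{}h", delta.whole_hours())
    } else {
        format!("{}m", delta.whole_minutes())
    };
    if created > now {
        format!("{base} left")
    } else {
        format!("{base} ago")
    }
}

fn main() {
    let now = OffsetDateTime::now_utc();
    println!("{}", rel_time(now - Duration::hours(3)));  // "3h ago"
    println!("{}", rel_time(now + Duration::hours(49))); // "2d left"
}
```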
static/highlighted.js Normal file
View File

@ -0,0 +1 @@
document.querySelector('#commentQueryForms').scrollIntoView();

View File

@ -1,10 +1,11 @@
{
"name": "Libreddit",
"short_name": "Libreddit",
"name": "Redlib",
"short_name": "Redlib",
"display": "standalone",
"background_color": "#1f1f1f",
"description": "An alternative private front-end to Reddit",
"theme_color": "#1f1f1f",
"start_url": "/",
"icons": [
{
"src": "logo.png",
@ -20,4 +21,4 @@
"sizes": "32x32"
}
]
}
}

static/opensearch.xml Normal file
View File

@ -0,0 +1,11 @@
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/"
xmlns:moz="http://www.mozilla.org/2006/browser/search/">
<ShortName>Search Redlib</ShortName>
<Description>Search for whatever you want on Redlib, awesome Reddit frontend</Description>
<InputEncoding>UTF-8</InputEncoding>
<Image width="32" height="32" type="image/x-icon">/favicon.ico</Image>
<Url type="text/html" template="/search">
<Param name="q" value="{searchTerms}"/>
</Url>
<moz:SearchForm>/search</moz:SearchForm>
</OpenSearchDescription>

View File

@ -26,6 +26,8 @@
--popup-goback-background: var(--popup-red);
--popup-goback-text: #222;
--popup-border: 1px solid var(--popup-red);
--footer-height: 30px;
}
@font-face {
@ -76,6 +78,21 @@
/* Other themes are located in the "themes" folder */
/* Tokyo Night theme setting */
.tokyoNight {
--accent: #565f89;
--green: #73daca;
--text: #a9b1d6;
--foreground: #24283b;
--background: #1a1b26;
--outside: #24283b;
--post: #1a1b26;
--panel-border: 1px solid #a9b1d6;
--highlighted: #414868;
--visited: #414868;
--shadow: 0 1px 3px rgba(0, 0, 0, 0.5);
}
/* General */
::selection {
@ -98,7 +115,13 @@ pre, form, fieldset, table, th, td, select, input {
body {
background: var(--background);
font-size: 15px;
}
body.fixed_navbar {
padding-top: 60px;
padding-bottom: var(--footer-height);
min-height: calc(100vh - 60px);
position: relative;
}
nav {
@ -117,8 +140,12 @@ nav {
z-index: 2;
top: 0;
padding: 5px 15px;
margin-bottom: 10px;
min-height: 40px;
width: calc(100% - 30px);
}
nav.fixed_navbar {
position: fixed;
}
@ -142,13 +169,7 @@ nav #links svg {
display: none;
}
nav #version {
opacity: 50%;
vertical-align: -2px;
margin-right: 10px;
}
nav #libreddit {
nav #redlib {
vertical-align: -2px;
}
@ -270,6 +291,7 @@ main {
max-width: 1000px;
padding: 10px 20px;
margin: 0 auto;
padding-bottom: 4em;
}
.wide main {
@ -292,22 +314,22 @@ main {
body > footer {
display: flex;
justify-content: center;
margin: 20px;
align-items: center;
width: 100%;
background: var(--post);
position: absolute;
bottom: 0;
}
.info-button {
.footer-button {
align-items: center;
border-radius: .25rem;
box-sizing: border-box;
color: var(--text);
cursor: pointer;
display: inline-flex;
font-size: 150%;
padding: 0.5em;
}
.info-button > a:hover {
text-decoration: none;
padding-left: 1em;
opacity: 0.8;
}
/* / Body footer. */
@ -329,6 +351,7 @@ button {
background: none;
border: none;
font-weight: bold;
cursor: pointer;
}
hr {
@ -379,13 +402,17 @@ aside {
border-radius: 5px;
overflow: hidden;
}
#subreddit, #sidebar { min-width: 350px; }
#user *, #subreddit * { text-align: center; }
#user, #sub_meta, #sidebar_contents { padding: 20px; }
#sidebar, #sidebar_contents { margin-top: 10px; }
#sidebar_label { padding: 10px; }
#sidebar_label, #subreddit_label {
padding: 10px;
text-align: left;
}
#user_icon, #sub_icon {
width: 100px;
@ -540,6 +567,7 @@ select, #search, #sort_options, #listing_options, #inside, #searchbox > *, #sort
select {
background: var(--outside);
transition: 0.2s background;
cursor: pointer;
}
select, #search {
@ -552,6 +580,10 @@ select, #search {
border-radius: 5px 0px 0px 5px;
}
.commentQuery {
background: var(--post);
}
#searchbox {
grid-area: searchbox;
display: flex;
@ -629,6 +661,15 @@ button.submit:hover > svg { stroke: var(--accent); }
background: transparent;
}
#commentQueryForms {
display: flex;
justify-content: space-between;
}
#allCommentsLink {
color: var(--green);
}
#sort, #search_sort {
display: flex;
align-items: center;
@ -752,6 +793,7 @@ a.search_subreddit:hover {
"post_score post_title post_thumbnail" 1fr
"post_score post_media post_thumbnail" auto
"post_score post_body post_thumbnail" auto
"post_score post_poll post_thumbnail" auto
"post_score post_notification post_thumbnail" auto
"post_score post_footer post_thumbnail" auto
/ minmax(40px, auto) minmax(0, 1fr) fit-content(min(20%, 152px));
@ -952,6 +994,44 @@ a.search_subreddit:hover {
overflow-wrap: anywhere;
}
.post_poll {
grid-area: post_poll;
padding: 5px 15px 5px 12px;
}
.poll_option {
position: relative;
margin-right: 15px;
margin-top: 14px;
z-index: 0;
display: flex;
align-items: center;
}
.poll_chart {
padding: 14px 0;
background-color: var(--accent);
opacity: 0.2;
border-radius: 5px;
z-index: -1;
position: absolute;
}
.poll_option span {
margin-left: 8px;
color: var(--text);
}
.poll_option span:nth-of-type(1) {
min-width: 10%;
font-weight: bold;
}
.most_voted {
opacity: 0.45;
width: 100%;
}
/* Used only for text post preview */
.post_preview {
-webkit-mask-image: linear-gradient(180deg,#000 60%,transparent);;
@ -1508,7 +1588,7 @@ td, th {
/* Mobile */
@media screen and (max-width: 800px) {
body { padding-top: 120px }
body.fixed_navbar { padding-top: 120px }
main {
flex-direction: column-reverse;
@ -1547,10 +1627,11 @@ td, th {
#user, #sidebar { margin: 20px 0; }
#logo, #links { margin-bottom: 5px; }
#searchbox { width: calc(100vw - 35px); }
}
@media screen and (max-width: 480px) {
body { padding-top: 100px; }
body.fixed_navbar { padding-top: 100px; }
#version { display: none; }
.post {
@ -1558,6 +1639,7 @@ td, th {
"post_title post_title post_thumbnail" 1fr
"post_media post_media post_thumbnail" auto
"post_body post_body post_thumbnail" auto
"post_poll post_poll post_thumbnail" auto
"post_notification post_notification post_thumbnail" auto
"post_score post_footer post_thumbnail" auto
/ auto 1fr fit-content(min(20%, 152px));
@ -1567,6 +1649,10 @@ td, th {
margin: 5px 0px 20px 15px;
padding: 0;
}
.post_poll {
padding: 5px 15px 10px 12px;
}
.compact .post_score { padding: 0; }
@ -1613,10 +1699,14 @@ td, th {
.popup {
width: auto;
bottom: 10vh;
}
.popup-inner > a, h1, p, img {
width: 100%;
.popup-inner {
max-width: 80%;
}
#commentQueryForms {
display: initial;
justify-content: initial;
}
}

View File

@ -0,0 +1,14 @@
/* Tokyo Night theme setting */
.tokyoNight {
--accent: #565f89;
--green: #73daca;
--text: #a9b1d6;
--foreground: #24283b;
--background: #1a1b26;
--outside: #24283b;
--post: #1a1b26;
--panel-border: 1px solid #a9b1d6;
--highlighted: #414868;
--visited: #414868;
--shadow: 0 1px 3px rgba(0, 0, 0, 0.5);
}

View File

@ -4,20 +4,22 @@
<html lang="en">
<head>
{% block head %}
<title>{% block title %}Libreddit{% endblock %}</title>
<title>{% block title %}Redlib{% endblock %}</title>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<meta name="description" content="View on Libreddit, an alternative private front-end to Reddit.">
<meta name="description" content="View on Redlib, an alternative private front-end to Reddit.">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<!-- General PWA -->
<meta name="theme-color" content="#1F1F1F">
<!-- iOS Application -->
<meta name="apple-mobile-web-app-title" content="Libreddit">
<meta name="apple-mobile-web-app-title" content="Redlib">
<meta name="apple-mobile-web-app-capable" content="yes">
<meta name="apple-mobile-web-app-status-bar-style" content="default">
<!-- Android -->
<meta name="mobile-web-app-capable" content="yes">
<!-- iOS Logo -->
<link href="/touch-icon-iphone.png" rel="apple-touch-icon">
<!-- OpenSearch description file -->
<link rel="search" type="application/opensearchdescription+xml" title="Search Redlib" href="/opensearch.xml">
<!-- PWA Manifest -->
<link rel="manifest" type="application/json" href="/manifest.json">
<link rel="shortcut icon" type="image/x-icon" href="/favicon.ico">
@ -27,12 +29,13 @@
<body class="
{% if prefs.layout != "" %}{{ prefs.layout }}{% endif %}
{% if prefs.wide == "on" %} wide{% endif %}
{% if prefs.theme != "system" %} {{ prefs.theme }}{% endif %}">
{% if prefs.theme != "system" %} {{ prefs.theme }}{% endif %}
{% if prefs.fixed_navbar == "on" %} fixed_navbar{% endif %}">
<!-- NAVIGATION BAR -->
<nav>
<nav class="
{% if prefs.fixed_navbar == "on" %} fixed_navbar{% endif %}">
<div id="logo">
<a id="libreddit" href="/"><span id="lib">lib</span><span id="reddit">reddit.</span></a>
<span id="version">v{{ env!("CARGO_PKG_VERSION") }}</span>
<a id="redlib" href="/"><span id="lib">red</span><span id="reddit">lib.</span></a>
{% block subscriptions %}{% endblock %}
</div>
{% block search %}{% endblock %}
@ -54,13 +57,6 @@
<circle cx="12" cy="12" r="3"/><path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"/>
</svg>
</a>
<a id="code" href="https://github.com/libreddit/libreddit" target="_blank" rel="noopener noreferrer">
<span>code</span>
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title>code</title>
<polyline points="16 18 22 12 16 6"/><polyline points="8 6 2 12 8 18"/>
</svg>
</a>
</div>
</nav>
@ -71,10 +67,16 @@
{% endblock %}
</main>
{% endblock %}
<!-- FOOTER -->
{% block footer %}
<footer>
<div class="info-button">
<a href="/info" title="View instance information">&#x24D8;</a>
<p id="version">v{{ env!("CARGO_PKG_VERSION") }}</p>
<div class="footer-button">
<a href="/info" title="View instance information">ⓘ View instance info</a>
</div>
<div class="footer-button">
<a href="https://github.com/redlib-org/redlib" title="View code on GitHub">&lt;&gt; Code</a>
</div>
</footer>
{% endblock %}

View File

@ -1,12 +1,18 @@
{% import "utils.html" as utils %}
{% if kind == "more" && parent_kind == "t1" %}
<a class="deeper_replies" href="{{ post_link }}{{ parent_id }}">&rarr; More replies</a>
<a class="deeper_replies" href="{{ post_link }}{{ parent_id }}">&rarr; More replies ({{ more_count }})</a>
{% else if kind == "t1" %}
<div id="{{ id }}" class="comment">
<div class="comment_left">
<p class="comment_score" title="{{ score.1 }}">{{ score.0 }}</p>
<div class="line"></div>
<p class="comment_score" title="{{ score.1 }}">
{% if prefs.hide_score != "on" %}
{{ score.0 }}
{% else %}
&#x2022;
{% endif %}
</p>
<div class="line"></div>
</div>
<details class="comment_right" {% if !collapsed || highlighted %}open{% endif %}>
<summary class="comment_data">
@ -35,7 +41,7 @@
<div class="comment_body {% if highlighted %}highlighted{% endif %}">{{ body|safe }}</div>
{% endif %}
<blockquote class="replies">{% for c in replies -%}{{ c.render().unwrap()|safe }}{%- endfor %}
</blockquote>
</bockquote>
</details>
</div>
{% endif %}

View File

@ -82,8 +82,14 @@
{% endif %}
<a href="{{ post.permalink }}">{{ post.title }}</a>{% if post.flags.nsfw %} <small class="nsfw">NSFW</small>{% endif %}
</h2>
<div class="post_score" title="{{ post.score.1 }}">{{ post.score.0 }}<span class="label"> Upvotes</span></div>
<div class="post_score" title="{{ post.score.1 }}">
{% if prefs.hide_score != "on" %}
{{ post.score.0 }}
{% else %}
&#x2022;
{% endif %}
<span class="label"> Upvotes</span></div>
<div class="post_footer">
<a href="{{ post.permalink }}" class="post_comments" title="{{ post.comments.1 }} comments">{{ post.comments.0 }} comments</a>
</div>

View File

@ -4,6 +4,8 @@
{% block content %}
<div id="error">
<h1>{{ msg }}</h1>
<h3><a href="https://www.redditstatus.com/">Reddit Status</a></h3>
<br />
<h3>Head back <a href="/">home</a>?</h3>
</div>
{% endblock %}

View File

@ -17,12 +17,14 @@
<p>
{% if crate::utils::sfw_only() %}
This instance of Libreddit is SFW-only.</p>
This instance of Redlib is SFW-only.</p>
{% else %}
Enable "Show NSFW posts" in <a href="/settings">settings</a> to view this {% if res_type == crate::utils::ResourceType::Subreddit %}subreddit{% else if res_type == crate::utils::ResourceType::User %}user's posts or comments{% else if res_type == crate::utils::ResourceType::Post %}post{% endif %}.
Enable "Show NSFW posts" in <a href="/settings">settings</a> to view this {% if res_type == crate::utils::ResourceType::Subreddit %}subreddit{% else if res_type == crate::utils::ResourceType::User %}user's posts or comments{% else if res_type == crate::utils::ResourceType::Post %}post{% endif %}. <br>
{% if res_type == crate::utils::ResourceType::Post %} You can also temporarily bypass this gate and view the post by clicking on this <a href="{{url}}&bypass_nsfw_landing">link</a>.{% endif %}
{% endif %}
</p>
</div>
{% endblock %}
{% block footer %}
{% endblock %}

View File

@ -14,11 +14,11 @@
<meta name="author" content="u/{{ post.author.name }}">
<meta name="title" content="{{ post.title }} - r/{{ post.community }}">
<meta property="og:title" content="{{ post.title }} - r/{{ post.community }}">
<meta property="og:description" content="View on Libreddit, an alternative private front-end to Reddit.">
<meta property="og:description" content="View on Redlib, an alternative private front-end to Reddit.">
<meta property="og:url" content="{{ post.permalink }}">
<meta property="twitter:url" content="{{ post.permalink }}">
<meta property="twitter:title" content="{{ post.title }} - r/{{ post.community }}">
<meta property="twitter:description" content="View on Libreddit, an alternative private front-end to Reddit.">
<meta property="twitter:description" content="View on Redlib, an alternative private front-end to Reddit.">
{% if post.post_type == "image" %}
<meta property="og:type" content="image">
<meta property="og:image" content="{{ post.thumbnail.url }}">
@ -31,6 +31,9 @@
<meta property="og:video:type" content="video/mp4">
{% else %}
<meta property="og:type" content="website">
{% if single_thread %}
<script src="/highlighted.js" defer></script>
{% endif %}
{% endif %}
{% endblock %}
@ -43,18 +46,32 @@
{% call utils::post(post) %}
<!-- SORT FORM -->
<div id="commentQueryForms">
<form id="sort">
<p id="comment_count">{{post.comments.0}} {% if post.comments.0 == "1" %}comment{% else %}comments{% endif %} <span id="sorted_by">sorted by </span></p>
<select name="sort" title="Sort comments by">
<select name="sort" title="Sort comments by" id="commentSortSelect">
{% call utils::options(sort, ["confidence", "top", "new", "controversial", "old"], "confidence") %}
</select><button id="sort_submit" class="submit">
<svg width="15" viewBox="0 0 110 100" fill="none" stroke-width="10" stroke-linecap="round">
<path d="M20 50 H100" />
<path d="M75 15 L100 50 L75 85" />
&rarr;
</svg>
</button>
</form>
</select>
<button id="sort_submit" class="submit">
<svg width="15" viewBox="0 0 110 100" fill="none" stroke-width="10" stroke-linecap="round">
<path d="M20 50 H100" />
<path d="M75 15 L100 50 L75 85" />
&rarr;
</svg>
</button>
</form>
<!-- SEARCH FORM -->
<form id="sort">
<input id="search" class="commentQuery" type="search" name="q" value="{{ comment_query }}" placeholder="Search comments">
<input type="hidden" name="type" value="comment">
</form>
</div>
<div>
{% if comment_query != "" %}
Comments containing "{{ comment_query }}"&nbsp;|&nbsp;<a id="allCommentsLink" href="{{ url_without_query }}">All comments</a>
{% endif %}
</div>
<!-- COMMENTS -->
{% for c in comments -%}

View File

@ -1,7 +1,7 @@
{% extends "base.html" %}
{% import "utils.html" as utils %}
{% block title %}Libreddit: search results - {{ params.q }}{% endblock %}
{% block title %}Redlib: search results - {{ params.q }}{% endblock %}
{% block subscriptions %}
{% call utils::sub_list("") %}
@ -10,7 +10,7 @@
{% block content %}
<div id="column_one">
<form id="search_sort">
<input id="search" type="text" name="q" placeholder="Search" value="{{ params.q|safe }}" title="Search libreddit">
<input id="search" type="text" name="q" placeholder="Search" value="{{ params.q|safe }}" title="Search redlib">
{% if sub != "" %}
<div id="inside">
<input type="checkbox" name="restrict_sr" id="restrict_sr" {% if params.restrict_sr != "" %}checked{% endif %}>
@@ -29,7 +29,7 @@
&rarr;
</svg>
</button>
</form>
</form>
{% if !is_filtered %}
{% if subreddits.len() > 0 || params.typed == "sr_user" %}
@@ -77,7 +77,13 @@
{% else %}
<div class="comment">
<div class="comment_left">
<p class="comment_score" title="{{ post.score.1 }}">{{ post.score.0 }}</p>
<p class="comment_score" title="{{ post.score.1 }}">
{% if prefs.hide_score != "on" %}
{{ post.score.0 }}
{% else %}
&#x2022;
{% endif %}
</p>
<div class="line"></div>
</div>
<details class="comment_right" open>
@@ -99,13 +105,13 @@
{% if params.typed != "sr_user" %}
<footer>
{% if params.before != "" %}
<a href="?q={{ params.q }}&restrict_sr={{ params.restrict_sr }}
<a href="?q={{ params.q|safe }}&restrict_sr={{ params.restrict_sr }}
&sort={{ params.sort }}&t={{ params.t }}
&before={{ params.before }}" accesskey="P">PREV</a>
{% endif %}
{% if params.after != "" %}
<a href="?q={{ params.q }}&restrict_sr={{ params.restrict_sr }}
<a href="?q={{ params.q|safe }}&restrict_sr={{ params.restrict_sr }}
&sort={{ params.sort }}&t={{ params.t }}
&after={{ params.after }}" accesskey="N">NEXT</a>
{% endif %}

View File

@@ -1,10 +1,10 @@
{% extends "base.html" %}
{% import "utils.html" as utils %}
{% block title %}Libreddit Settings{% endblock %}
{% block title %}Redlib Settings{% endblock %}
{% block search %}
{% call utils::search("".to_owned(), "", "") %}
{% call utils::search("".to_owned(), "") %}
{% endblock %}
{% block content %}
@@ -71,11 +71,16 @@
<input type="hidden" value="off" name="autoplay_videos">
<input type="checkbox" name="autoplay_videos" id="autoplay_videos" {% if prefs.autoplay_videos == "on" %}checked{% endif %}>
</div>
<div class="prefs-group">
<label for="fixed_navbar">Keep navbar fixed</label>
<input type="hidden" value="off" name="fixed_navbar">
<input type="checkbox" name="fixed_navbar" {% if prefs.fixed_navbar == "on" %}checked{% endif %}>
</div>
<div class="prefs-group">
<label for="use_hls">Use HLS for videos</label>
<details id="feeds">
<summary>Why?</summary>
<div id="feed_list" class="helper">Reddit videos require JavaScript (via HLS.js) to be enabled to be played with audio. Therefore, this toggle lets you either use Libreddit JS-free or utilize this feature.</div>
<div id="feed_list" class="helper">Reddit videos require JavaScript (via HLS.js) to be enabled to be played with audio. Therefore, this toggle lets you either use Redlib JS-free or utilize this feature.</div>
</details>
<input type="hidden" value="off" name="use_hls">
<input type="checkbox" name="use_hls" id="use_hls" {% if prefs.use_hls == "on" %}checked{% endif %}>
@@ -90,6 +95,11 @@
<input type="hidden" value="off" name="hide_awards">
<input type="checkbox" name="hide_awards" id="hide_awards" {% if prefs.hide_awards == "on" %}checked{% endif %}>
</div>
<div class="prefs-group">
<label for="hide_score">Hide score</label>
<input type="hidden" value="off" name="hide_score">
<input type="checkbox" name="hide_score" id="hide_score" {% if prefs.hide_score == "on" %}checked{% endif %}>
</div>
<div class="prefs-group">
<label for="disable_visit_reddit_confirmation">Do not confirm before visiting content on Reddit</label>
<input type="hidden" value="off" name="disable_visit_reddit_confirmation">
@@ -132,7 +142,7 @@
<div id="settings_note">
<p><b>Note:</b> settings and subscriptions are saved in browser cookies. Clearing your cookies will reset them.</p><br>
<p>You can restore your current settings and subscriptions after clearing your cookies using <a href="/settings/restore/?theme={{ prefs.theme }}&front_page={{ prefs.front_page }}&layout={{ prefs.layout }}&wide={{ prefs.wide }}&post_sort={{ prefs.post_sort }}&comment_sort={{ prefs.comment_sort }}&show_nsfw={{ prefs.show_nsfw }}&blur_nsfw={{ prefs.blur_nsfw }}&use_hls={{ prefs.use_hls }}&hide_hls_notification={{ prefs.hide_hls_notification }}&hide_awards={{ prefs.hide_awards }}&disable_visit_reddit_confirmation={{ prefs.disable_visit_reddit_confirmation }}&subscriptions={{ prefs.subscriptions.join("%2B") }}&autoplay_videos={{ prefs.autoplay_videos }}&filters={{ prefs.filters.join("%2B") }}">this link</a>.</p>
<p>You can restore your current settings and subscriptions after clearing your cookies using <a href="/settings/restore/?theme={{ prefs.theme }}&front_page={{ prefs.front_page }}&layout={{ prefs.layout }}&wide={{ prefs.wide }}&post_sort={{ prefs.post_sort }}&comment_sort={{ prefs.comment_sort }}&show_nsfw={{ prefs.show_nsfw }}&use_hls={{ prefs.use_hls }}&hide_hls_notification={{ prefs.hide_hls_notification }}&hide_awards={{ prefs.hide_awards }}&fixed_navbar={{ prefs.fixed_navbar }}&subscriptions={{ prefs.subscriptions.join("%2B") }}&filters={{ prefs.filters.join("%2B") }}">this link</a>.</p>
</div>
</div>
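The hidden value="off" input that precedes each checkbox above (hide_score, fixed_navbar, and the rest) is what keeps an unchecked box from vanishing from the submission entirely: browsers omit unchecked checkboxes, so the hidden field always supplies "off", and when the box is checked the later "on" pair overrides it. A minimal Rust sketch of that last-value-wins parsing, purely illustrative and not redlib's actual form handling:

use std::collections::HashMap;

// Parse an application/x-www-form-urlencoded body, keeping the last value
// seen for each key, so "hide_score=off&hide_score=on" resolves to "on".
// (Simplified: real handling would also percent-decode keys and values.)
fn parse_prefs(body: &str) -> HashMap<String, String> {
    let mut prefs = HashMap::new();
    for pair in body.split('&') {
        if let Some((key, value)) = pair.split_once('=') {
            // Later occurrences overwrite earlier ones.
            prefs.insert(key.to_string(), value.to_string());
        }
    }
    prefs
}

// An unchecked box submits only the hidden field:
//   parse_prefs("hide_score=off")                 -> hide_score = "off"
//   parse_prefs("hide_score=off&hide_score=on")   -> hide_score = "on"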

View File

@@ -4,7 +4,7 @@
{% block title %}
{% if sub.title != "" %}{{ sub.title }}
{% else if sub.name != "" %}{{ sub.name }}
{% else %}Libreddit{% endif %}
{% else %}Redlib{% endif %}
{% endblock %}
{% block search %}
@@ -12,7 +12,7 @@
{% endblock %}
{% block subscriptions %}
{% call utils::sub_list(sub.name.as_str(), "wide") %}
{% call utils::sub_list(sub.name.as_str()) %}
{% endblock %}
{% block body %}
@@ -88,7 +88,8 @@
<center>(Content from r/{{ sub.name }} has been filtered)</center>
{% endif %}
{% if !sub.name.is_empty() && sub.name != "all" && sub.name != "popular" && !sub.name.contains("+") %}
<div class="panel" id="subreddit">
<details class="panel" id="subreddit" open>
<summary id="subreddit_label">Subreddit</summary>
{% if sub.wiki %}
<div id="top">
<div>Posts</div>
@@ -131,7 +132,7 @@
</div>
</div>
</div>
</div>
</details>
<details class="panel" id="sidebar">
<summary id="sidebar_label">Sidebar</summary>
<div id="sidebar_contents">

View File

@@ -2,10 +2,10 @@
{% import "utils.html" as utils %}
{% block search %}
{% call utils::search("".to_owned(), "", "") %}
{% call utils::search("".to_owned(), "") %}
{% endblock %}
{% block title %}{{ user.name.replace("u/", "") }} (u/{{ user.name }}) - Libreddit{% endblock %}
{% block title %}{{ user.name.replace("u/", "") }} (u/{{ user.name }}) - Redlib{% endblock %}
{% block subscriptions %}
{% call utils::sub_list("") %}
@@ -52,12 +52,18 @@
{% else %}
<div class="comment">
<div class="comment_left">
<p class="comment_score" title="{{ post.score.1 }}">{{ post.score.0 }}</p>
<p class="comment_score" title="{{ post.score.1 }}">
{% if prefs.hide_score != "on" %}
{{ post.score.0 }}
{% else %}
&#x2022;
{% endif %}
</p>
<div class="line"></div>
</div>
<details class="comment_right" open>
<summary class="comment_data">
<a class="comment_link" href="{{ post.permalink }}">COMMENT</a>
<a class="comment_link" href="{{ post.permalink }}">Comment on r/{{ post.community }}</a>
<span class="created" title="{{ post.created }}">{{ post.rel_time }}</span>
</summary>
<p class="comment_body">{{ post.body|safe }}</p>

View File

@@ -16,7 +16,7 @@
{% macro search(root, search) -%}
<form action="{% if root != "/r/" && !root.is_empty() %}{{ root }}{% endif %}/search" id="searchbox">
<input id="search" type="text" name="q" placeholder="Search" title="Search libreddit" value="{{ search }}">
<input id="search" type="text" name="q" placeholder="Search" title="Search redlib" value="{{ search }}">
{% if root != "/r/" && !root.is_empty() %}
<div id="inside">
<input type="checkbox" name="restrict_sr" id="restrict_sr" checked>
@@ -100,6 +100,10 @@
{% if post.post_type == "image" %}
<div class="post_media_content">
<a href="{{ post.media.url }}" class="post_media_image" >
{% if post.media.height == 0 || post.media.width == 0 %}
<!-- i.redd.it images special case -->
<img width="100%" height="100%" loading="lazy" alt="Post image" src="{{ post.media.url }}"/>
{% else %}
<svg
width="{{ post.media.width }}px"
height="{{ post.media.height }}px"
@@ -109,6 +113,7 @@
<img loading="lazy" alt="Post image" src="{{ post.media.url }}"/>
</desc>
</svg>
{% endif %}
</a>
</div>
{% else if post.post_type == "video" || post.post_type == "gif" %}
@@ -147,7 +152,13 @@
<!-- POST BODY -->
<div class="post_body">{{ post.body|safe }}</div>
<div class="post_score" title="{{ post.score.1 }}">{{ post.score.0 }}<span class="label"> Upvotes</span></div>
<div class="post_score" title="{{ post.score.1 }}">
{% if prefs.hide_score != "on" %}
{{ post.score.0 }}
{% else %}
&#x2022;
{% endif %}
<span class="label"> Upvotes</span></div>
<div class="post_footer">
<ul id="post_links">
<li class="desktop_item"><a href="{{ post.permalink }}">permalink</a></li>
@@ -216,7 +227,11 @@
<!-- POST MEDIA/THUMBNAIL -->
{% if (prefs.layout.is_empty() || prefs.layout == "card") && post.post_type == "image" %}
<div class="post_media_content">
<a href="{{ post.media.url }}" class="post_media_image {% if post.media.height / post.media.width < 2 %}short{% endif %}" >
<a href="{{ post.media.url }}" class="post_media_image {% if post.media.height < post.media.width*2 %}short{% endif %}" >
{% if post.media.height == 0 || post.media.width == 0 %}
<!-- i.redd.it images special case -->
<img width="100%" height="100%" loading="lazy" alt="Post image" src="{{ post.media.url }}"/>
{% else %}
<svg
{%if post.flags.nsfw && prefs.blur_nsfw=="on" %}class="post_nsfw_blur"{% endif %}
width="{{ post.media.width }}px"
@@ -227,6 +242,7 @@
<img loading="lazy" alt="Post image" src="{{ post.media.url }}"/>
</desc>
</svg>
{% endif %}
</a>
</div>
{% else if (prefs.layout.is_empty() || prefs.layout == "card") && post.post_type == "gif" %}
@@ -267,11 +283,19 @@
<span>{% if post.post_type == "link" %}{{ post.domain }}{% else %}{{ post.post_type }}{% endif %}</span>
</a>
{% endif %}
<div class="post_score" title="{{ post.score.1 }}">{{ post.score.0 }}<span class="label"> Upvotes</span></div>
<div class="post_score" title="{{ post.score.1 }}">
{% if prefs.hide_score != "on" %}
{{ post.score.0 }}
{% else %}
&#x2022;
{% endif %}
<span class="label"> Upvotes</span></div>
<div class="post_body post_preview">
{{ post.body|safe }}
</div>
{% call poll(post) %}
<div class="post_footer">
<a href="{{ post.permalink }}" class="post_comments" title="{{ post.comments.1 }} {% if post.comments.1 == "1" %}comment{% else %}comments{% endif %}">{{ post.comments.0 }} {% if post.comments.1 == "1" %}comment{% else %}comments{% endif %}</a>
</div>
@@ -281,7 +305,7 @@
{% macro visit_reddit_confirmation(url) -%}
<div class="popup" id="popup">
<div class="popup-inner">
<h1>You are about to leave Libreddit</h1>
<h1>You are about to leave Redlib</h1>
<p>Do you want to continue?</p>
<p id="reddit_url">https://www.reddit.com{{ url }}</p>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 639.24 563">
@@ -299,3 +323,34 @@
</div>
</div>
{%- endmacro %}
{% macro poll(post) -%}
{% match post.poll %}
{% when Some with (poll) %}
{% let widest = poll.most_votes() %}
<div class="post_poll">
<span>{{ poll.total_vote_count }} votes,</span>
<span title="{{ poll.voting_end_timestamp.1 }}">{{ poll.voting_end_timestamp.0 }}</span>
{% for option in poll.poll_options %}
<div class="poll_option">
{# Posts without vote_count (all open polls) will show up without votes.
This is an issue with the Reddit API; it doesn't work on Old Reddit either. #}
{% match option.vote_count %}
{% when Some with (vote_count) %}
{% if vote_count.eq(widest) || widest == 0 %}
<div class="poll_chart most_voted"></div>
{% else %}
<div class="poll_chart" style="width: {{ (vote_count * 100) / widest }}%"></div>
{% endif %}
<span>{{ vote_count }}</span>
{% when None %}
<div class="poll_chart most_voted"></div>
<span></span>
{% endmatch %}
<span>{{ option.text }}</span>
</div>
{% endfor %}
</div>
{% when None %}
{% endmatch %}
{%- endmacro %}
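
A rough Rust sketch of the poll data the macro above leans on; the field and method names mirror what the template calls (poll_options, vote_count, total_vote_count, most_votes()), but the struct layout itself is an assumption for illustration rather than redlib's exact definition:

struct PollOption {
    text: String,
    // None while a poll is open and Reddit withholds per-option counts.
    vote_count: Option<u64>,
}

struct Poll {
    poll_options: Vec<PollOption>,
    total_vote_count: u64,
}

impl Poll {
    // Highest per-option count, or 0 when no option reports one; the
    // template divides by this to size each bar as a percentage.
    fn most_votes(&self) -> u64 {
        self.poll_options.iter().filter_map(|o| o.vote_count).max().unwrap_or(0)
    }
}

So an option with 25 votes in a poll whose widest option has 100 gets a bar at (25 * 100) / 100 = 25% width, while the widest option itself, or every option when widest is 0, falls through to the full-width most_voted bar.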

View File

@@ -3,7 +3,7 @@
{% block title %}
{% if sub != "" %}{{ page }} - {{ sub }}
{% else %}Libreddit{% endif %}
{% else %}Redlib{% endif %}
{% endblock %}
{% block search %}