Compare commits

...

108 Commits

Author SHA1 Message Date
9c1a932214 Clean Up Post Headers 2021-01-04 21:17:19 -08:00
8c0269af1c Fix post tags on mobile 2021-01-04 19:43:35 -08:00
df89c5076e Compact Libreddit Posts on Mobile 2021-01-04 19:26:41 -08:00
f819ad2bc6 Remove CSP "Upgrade Insecure Requests" Header 2021-01-04 10:11:07 -08:00
f5884a5270 Update Screenshot 2021-01-03 21:32:45 -08:00
c046d00060 Handle Unwrapping Errors 2021-01-03 21:31:21 -08:00
5934e34ea0 Merge pull request #30 from moosingin3space/master
Add controversial sort order
2021-01-03 21:15:27 -08:00
463b44ac52 Fix timeframe when sorting by controversial 2021-01-04 05:05:21 +00:00
b40d21e559 Add controversial sort order 2021-01-03 21:00:36 -08:00
a422a74747 Make Design More Compact 2021-01-03 19:44:44 -08:00
4124fa87d3 Correct Readme 2021-01-03 18:24:30 -08:00
1dd0c4ee20 Fix User Icon Proxy 2021-01-03 18:23:57 -08:00
0dd114c166 Post upvote ratio, permalink and reddit link 2021-01-03 13:06:49 -08:00
67090e9b08 Fix Proxied Icons 2021-01-03 10:22:41 -08:00
d97fb49fde Fix post::item IDs 2021-01-02 22:46:02 -08:00
9263b0657f Fix navbar padding 2021-01-02 22:40:22 -08:00
a3384cbaa6 Fix search pages 2021-01-02 22:37:54 -08:00
5d26b5c764 Upgrade Insecure Requests 2021-01-02 20:59:14 -08:00
516403ee47 Fix Readme 2021-01-02 20:59:04 -08:00
5ea504e6e8 Restrict Proxy to Reddit Domains 2021-01-02 20:50:23 -08:00
f49bff9853 Optimize Sequencing 2021-01-02 11:09:26 -08:00
4ec529cdb8 Rewrite Reddit Links to Libreddit 2021-01-02 10:58:21 -08:00
779de6f8af Fix Wiki Titles 2021-01-01 22:34:25 -08:00
0925a9b334 Add Wiki Pages 2021-01-01 22:21:43 -08:00
2f2ed6169d Optimize use of .unwrap() 2021-01-01 15:28:13 -08:00
59ef30c76d Remove .clone() in favor of borrowing 2021-01-01 12:55:09 -08:00
d43b49e7e4 Optimize Rust code with Clippy 2021-01-01 12:33:57 -08:00
64a92195dd Merge pull request #19 from somoso/patch-1
Fix posts overflowing on Safari on iOS
2021-01-01 11:52:21 -08:00
a7925ed62d Fix posts overflowing on Safari on iOS
In Safari, the value `anywhere` is not supported for property `overflow-wrap`. Once changed to `break-word`, it behaves like it does in Chrome and Firefox.
2021-01-01 15:46:36 +00:00
39ba50dada Error Page 2020-12-31 21:03:44 -08:00
bc1b29246d Update Screenshot 2020-12-31 20:23:19 -08:00
2d77a91150 Refactor Page Titles and Add Subreddit/User Titles 2020-12-31 20:21:56 -08:00
93c1db502d Fix Title and Navbar 2020-12-31 16:45:10 -08:00
a6dc7ee043 Rewrite + Searching 2020-12-31 15:54:13 -08:00
c7282520cd Add Focus Indicator 2020-12-30 10:53:27 -08:00
a866c1d068 Update Screenshot for v0.2.3 2020-12-29 19:40:49 -08:00
aa9aad6743 Stickied Posts 2020-12-29 19:01:02 -08:00
f65ee2eb6a Sort Top by Timeframe 2020-12-29 17:11:47 -08:00
44c4341e67 Update README.md 2020-12-29 13:29:24 -08:00
1c886f8003 Merge pull request #10 from StuffNoOneCaresAbout/add-instance
instances: add libreddit.kavin.rocks
2020-12-29 09:53:23 -08:00
b481d26be2 instances: add libreddit.kavin.rocks
And it's onion counterpart.
2020-12-29 12:53:49 +05:30
f00ef59404 Fix proxy-less deployment 2020-12-28 20:49:15 -08:00
3115ff3436 Update README.md 2020-12-28 18:45:46 -08:00
443b198c12 Markdown and Subreddit Sidebars 2020-12-28 18:42:46 -08:00
ac84d8d2db List Instances as a Table 2020-12-27 21:46:39 -08:00
e27cf94fbf Include Comment Lines in User History 2020-12-27 15:37:01 -08:00
68495fb280 Add Pages to User Profiles 2020-12-27 12:36:10 -08:00
bec5c78709 Persist Sort on Subreddit Pages 2020-12-26 12:43:51 -08:00
abfcfdf09e Merge pull request #8 from zachjmurphy/master
Add InsanityWtf Libreddit Instance
2020-12-25 18:55:41 -08:00
dad01749e6 Replace Responsive Feature with Secure 2020-12-25 18:06:33 -08:00
2efb73cee3 Update README.md 2020-12-25 17:09:11 -05:00
ace21b21d5 Redesign User/Subreddit About Boxes 2020-12-23 22:16:04 -08:00
280e16bd7f Fix Subreddit Icons 2020-12-23 20:36:49 -08:00
44d44a529c Add DotHQ Libreddit Instance 2020-12-23 13:34:05 -08:00
0957f2e339 NSFW Support 2020-12-22 18:29:43 -08:00
3516404a5f Update v0.2.2 2020-12-22 09:15:55 -08:00
d96daa335f Add Repl.it as Hosting Method 2020-12-22 09:11:03 -08:00
285d9da26d Further Document Libreddit Privacy 2020-12-22 08:45:21 -08:00
9ab7a72bce Fix Comparison Heading Position 2020-12-21 21:53:54 -08:00
46dd905509 Fix Grammar Mistakes 2020-12-21 21:52:50 -08:00
63d595c67d Revise README 2020-12-21 21:51:18 -08:00
dc0b5f42e6 Update README with Reddit Comparison 2020-12-21 21:40:06 -08:00
9ecbd25488 Reorganize CSS 2020-12-21 21:39:55 -08:00
83816fbcc6 Allow Indexing 2020-12-21 21:39:10 -08:00
11cfbdc3ed More Replies Button 2020-12-21 17:17:40 -08:00
4b7cbb3de2 Fix User Icons 2020-12-21 14:12:53 -08:00
b1a572072c Highlight Post Authors in Comments 2020-12-21 08:38:24 -08:00
b1071e9579 Switch Sorting System to Dropdown 2020-12-20 21:49:31 -08:00
da971f8680 Optimized Nested Comments for Mobile, Added IDs 2020-12-20 20:52:15 -08:00
b596f86cc2 Update Screenshot 2020-12-20 19:05:44 -08:00
3bcf0832a1 Correct README Regarding Multireddits 2020-12-20 17:45:52 -08:00
565f4f23b3 Multireddit Support & Referrer Policy 2020-12-20 17:45:26 -08:00
ef3820a2e1 User Flairs 2020-12-20 11:29:23 -08:00
1678245750 Add Sorting to Short Links 2020-12-20 09:10:37 -08:00
3594b6d41f Fix CSS and CSP 2020-12-19 22:25:00 -08:00
a754d42b9e Enforce Content Security Policy 2020-12-19 21:49:10 -08:00
c7e0234d33 Fix comment hover color 2020-12-19 21:44:30 -08:00
11a9ff53e4 Update README 2020-12-19 21:44:07 -08:00
7b8f694c8c Basic Nested & Collapsible Comments 2020-12-19 19:54:46 -08:00
19dc7de3c5 Update README.md 2020-12-18 16:24:09 -08:00
cd29cfbf29 Update Readme
Specify lack of Windows/MacOS binaries, update AUR installation method, correct Docker comparison
2020-12-16 13:01:11 -08:00
d0ec1fcc43 Update v0.1.11 2020-12-14 16:35:38 -08:00
75bc170eba Rewrite URL Dispatch 2020-12-14 16:35:04 -08:00
148d87fb45 Add Elsewhere Links 2020-12-12 08:57:23 -08:00
5219c919af Refactor Last Commit 2020-12-11 20:36:25 -08:00
5bda103356 Fix Post URL Colors 2020-12-11 20:36:06 -08:00
81274e35d7 Fix Post Body Links 2020-12-08 11:31:01 -08:00
e1962c7b66 Fix Header 2020-12-08 09:58:36 -08:00
528fe15819 Add million support 2020-12-07 11:36:05 -08:00
8509f6e22d Merge branch 'master' of https://github.com/spikecodes/libreddit 2020-12-07 11:20:33 -08:00
77886579f4 Link Post Titles 2020-12-07 11:20:24 -08:00
4f5ba35ddb Merge pull request #6 from Scoder12/feature/improve-actions
Use rust-cache action and prettify workflow yml
2020-12-07 11:07:22 -08:00
c738300bc4 Use rust-cache action and prettify
Run `prettier` on the workflow file
2020-12-07 11:05:00 -08:00
293a4d5c50 Merge pull request #5 from Scoder12/master
Add number format utility
2020-12-07 11:00:38 -08:00
312d162c09 Fix mistakes 2020-12-07 10:53:22 -08:00
9f19d729d1 Add number format utility 2020-12-07 10:32:46 -08:00
6794f7d6ba Show and Log Version 2020-12-05 21:29:25 -08:00
04310c58e0 Mobile Responsive 2020-12-05 20:54:43 -08:00
6def67ddfe Merge branch 'master' of https://github.com/spikecodes/libreddit 2020-12-05 09:53:44 -08:00
c33f7947b0 Update Readme 2020-12-05 09:53:41 -08:00
98d10d6596 Fix rust.yml 2020-12-03 21:08:40 -08:00
863b512718 Create rust.yml 2020-12-03 21:07:01 -08:00
d6971bb9a3 Update dependencies 2020-12-02 13:37:33 -08:00
fc98ca9af9 Update Dependencies 2020-11-30 21:10:39 -08:00
f33af75267 Proxy Thumbnails 2020-11-30 21:10:08 -08:00
759c9fc66b Revert to reqwest 2020-11-30 20:57:15 -08:00
9d78266494 Use base64 for encoding & Upgrade Media Handling 2020-11-30 20:33:55 -08:00
9a6430656d Added Percent Encoding Support 2020-11-29 18:50:29 -08:00
26 changed files with 1931 additions and 18183 deletions

29
.github/workflows/rust.yml vendored Normal file
View File

@ -0,0 +1,29 @@
name: Rust
on:
push:
branches: [master]
pull_request:
branches: [master]
env:
CARGO_TERM_COLOR: always
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Cache Packages
uses: Swatinem/rust-cache@v1.0.1
- name: Build
run: cargo build --release
- uses: actions/upload-artifact@v2.2.1
name: Upload a Build Artifact
with:
name: libreddit
path: target/release/libreddit

1028
Cargo.lock generated

File diff suppressed because it is too large Load Diff

View File

@ -3,19 +3,18 @@ name = "libreddit"
description = " Alternative private front-end to Reddit"
license = "AGPL-3.0"
repository = "https://github.com/spikecodes/libreddit"
version = "0.1.7"
version = "0.2.6"
authors = ["spikecodes <19519553+spikecodes@users.noreply.github.com>"]
edition = "2018"
[features]
default = ["proxy"]
proxy = ["actix-web/rustls"]
[dependencies]
actix-web = "3.2.0"
surf = "2.1.0"
base64 = "0.13.0"
actix-web = { version = "3.2.0", features = ["rustls"] }
reqwest = { version = "0.10", default_features = false, features = ["rustls-tls"] }
askama = "0.8.0"
serde = "1.0.117"
serde_json = "1.0"
pulldown-cmark = "0.8.0"
chrono = "0.4.19"
async-recursion = "0.3.1"
url = "2.2.0"
regex = "1"

185
README.md
View File

@ -2,60 +2,135 @@
> An alternative private front-end to Reddit
Libre + Reddit = Libreddit
Libre + Reddit = [Libreddit](https://libredd.it)
- 🚀 Fast: written in Rust for blazing fast speeds and safety
- ☁️ Light: no javascript, no ads, no tracking
- ☁️ Light: no JavaScript, no ads, no tracking
- 🕵 Private: all requests are proxied through the server, including media
- 🔒 Safe: does not rely on Reddit's OAuth-requiring APIs
- 📱 Responsive: works great on mobile!
- 🦺 Safe: does not rely on Reddit OAuth or require a Reddit API Key
- 🔒 Secure: strong [Content Security Policy](https://developer.mozilla.org/en-US/docs/Web/HTTP/CSP) prevents browser requests to Reddit
Think Invidious but for Reddit. Watch your cat videos without being watched.
Like [Invidious](https://github.com/iv-org/invidious) but for Reddit. Browse the coldest takes of [r/unpopularopinion](https://libredd.it/r/unpopularopinion) without being [tracked](#reddit).
## Contents
- [Screenshot](#screenshot)
- [Instances](#instances)
- [About](#about)
- [Elsewhere](#elsewhere)
- [Info](#info)
- [Teddit Comparison](#how-does-it-compare-to-teddit)
- [Comparison](#comparison)
- [Speed](#speed)
- [Privacy](#privacy)
- [Installation](#installation)
- [Cargo](#a-cargo)
- [Docker](#b-docker)
- [AUR](#c-aur)
- [GitHub Releases](#d-github-releases)
- [Repl.it](#e-replit)
- Developing
- [Deployment](#deployment)
- [Building](#building)
## Screenshot
![](https://i.ibb.co/Tgjb3w7/image.png)
## Status
- [x] Hosting
- [x] Instances
- [x] Clearnet instance
- [ ] .onion instance
- [x] Cargo deployment
- [x] Docker deployment
- [x] Subreddits
- [x] Title
- [x] Description
- [x] Posts
- [x] Post sorting
- [x] Posts
- [x] Flairs
- [x] Comments
- [x] Comment sorting
- [ ] Nested comments
- [x] UTC post date
- [x] Image thumbnails
- [x] Embedded images
- [x] Proxied images
- [x] Reddit-hosted video
- [x] Proxied video
- [x] Users
- [x] Username
- [x] Karma
- [x] Description
- [x] Post history
- [x] Comment history
- [ ] Search
- [ ] Post aggregating
- [ ] Comment aggregating
- [ ] Result sorting
![](https://i.ibb.co/6mXqb4G/libreddit-rust.png)
## Instances
- [libredd.it](https://libredd.it) 🇺🇸 (Thank you to [YeapGuy](https://github.com/YeapGuy)!)
- [libreddit.spike.codes](https://libreddit.spike.codes) 🇺🇸
Feel free to [open an issue](https://github.com/spikecodes/libreddit/issues/new) to have your [selfhosted instance](#deployment) listed here!
| Website | Country | Cloudflare |
|-|-|-|
| [libredd.it](https://libredd.it) (official) | 🇺🇸 US | |
| [libreddit.spike.codes](https://libreddit.spike.codes) (official) | 🇺🇸 US | |
| [libreddit.dothq.co](https://libreddit.dothq.co) | 🇺🇸 US | ✅ |
| [libreddit.insanity.wtf](https://libreddit.insanity.wtf) | 🇺🇸 US | ✅ |
| [libreddit.kavin.rocks](https://libreddit.kavin.rocks) | 🇮🇳 IN | ✅ |
| [spjmllawtheisznfs7uryhxumin26ssv2draj7oope3ok3wuhy43eoyd.onion](http://spjmllawtheisznfs7uryhxumin26ssv2draj7oope3ok3wuhy43eoyd.onion) | 🇮🇳 IN | |
A checkmark in the "Cloudflare" column refers to the use of the reverse proxy, [Cloudflare](https://cloudflare.com). The checkmark is not given to sites that merely use Cloudflare DNS, only to those proxied through Cloudflare, which grants Cloudflare the ability to monitor traffic to the website.
## About
### Elsewhere
Find Libreddit on...
- 💬 Matrix: [#libreddit:matrix.org](https://matrix.to/#/#libreddit:matrix.org)
- 🐋 Docker: [spikecodes/libreddit](https://hub.docker.com/r/spikecodes/libreddit)
- :octocat: GitHub: [spikecodes/libreddit](https://github.com/spikecodes/libreddit)
- 🦊 GitLab: [spikecodes/libreddit](https://gitlab.com/spikecodes/libreddit)
### Info
Libreddit hopes to provide an easier way to browse Reddit, without the ads, trackers, and bloat. Libreddit was inspired by other alternative front-ends to popular services such as [Invidious](https://github.com/iv-org/invidious) for YouTube, [Nitter](https://github.com/zedeus/nitter) for Twitter, and [Bibliogram](https://sr.ht/~cadence/bibliogram/) for Instagram.
Libreddit currently implements most of Reddit's (signed-out) functionalities but still lacks [a few features](https://github.com/spikecodes/libreddit/issues).
### How does it compare to Teddit?
Teddit is another awesome open source project designed to provide an alternative frontend to Reddit. There is no connection between the two and you're welcome to use whichever one you favor. Competition fosters innovation and Teddit's release has motivated me to build Libreddit into an even more polished product.
If you are looking to compare, the biggest differences I have noticed are:
- Libreddit is themed around Reddit's redesign whereas Teddit appears to stick much closer to Reddit's old design. This may suit some users better as design is always subjective.
- Libreddit is written in [Rust](https://www.rust-lang.org) for speed and memory safety. It uses [Actix Web](https://actix.rs), which was [benchmarked as the fastest web server for single queries](https://www.techempower.com/benchmarks/#hw=ph&test=db).
## Comparison
This section outlines how Libreddit compares to Reddit.
### Speed
Last tested December 21, 2020.
Results from Google Lighthouse ([Libreddit Report](https://lighthouse-dot-webdotdevsite.appspot.com/lh/html?url=https%3A%2F%2Flibredd.it), [Reddit Report](https://lighthouse-dot-webdotdevsite.appspot.com/lh/html?url=https%3A%2F%2Fwww.reddit.com%2F)).
| | Libreddit | Reddit |
|---------------------|---------------|-----------|
| Requests | 22 | 70 |
| Resource Size | 135 KiB | 2,222 KiB |
| Time to Interactive | **1.7 s** | **11.5 s**|
### Privacy
#### Reddit
**Logging:** According to Reddit's [privacy policy](https://www.redditinc.com/policies/privacy-policy), they "may [automatically] log information" including:
- IP address
- User-agent string
- Browser type
- Operating system
- Referral URLs
- Device information (e.g., device IDs)
- Device settings
- Pages visited
- Links clicked
- The requested URL
- Search terms
**Location:** The same privacy policy goes on to state that location data may be collected through the use of:
- GPS (consensual)
- Bluetooth (consensual)
- Content associated with a location (consensual)
- Your IP Address
**Cookies:** Reddit's [cookie notice](https://www.redditinc.com/policies/cookies) documents the array of cookies used by Reddit including/regarding:
- Authentication
- Functionality
- Analytics and Performance
- Advertising
- Third-Party Cookies
- Third-Party Site
#### Libreddit
For transparency, I hope to describe all the ways Libreddit handles user privacy.
**Logging:** In production (when running the binary, hosting with docker, or using the official instances), Libreddit logs nothing. When debugging (running from source without `--release`), Libreddit logs post IDs and URL paths fetched to aid troubleshooting but nothing else.
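For illustration, a minimal sketch of this debug-only gate, mirroring the `#[cfg(debug_assertions)]` + `dbg!` pattern visible in the `src/post.rs` and `src/utils.rs` diffs below (the function name and post ID here are hypothetical):
```
// Minimal sketch of debug-only logging; mirrors the
// #[cfg(debug_assertions)] + dbg! pattern in src/post.rs and src/utils.rs.
// `render_post` and the post ID are hypothetical.
fn render_post(id: &str) -> String {
  // Compiled out under `--release`, so production builds log nothing.
  #[cfg(debug_assertions)]
  dbg!(id);

  format!("<rendered post {}>", id)
}

fn main() {
  println!("{}", render_post("kq1x2o"));
}
```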
**DNS:** Both official domains (`libredd.it` and `libreddit.spike.codes`) use Cloudflare as the DNS resolver. However, the sites are not proxied through Cloudflare, meaning Cloudflare does not have access to user traffic.
**Cookies:** Libreddit currently uses no cookies, but eventually I plan to add a configuration page where users can store an optional cookie to save their preferred theme, default sorting algorithm, or default layout.
**Hosting:** The official instances (`libredd.it` and `libreddit.spike.codes`) are hosted on [Repl.it](https://repl.it/), which monitors usage to prevent abuse. I can understand if this invalidates certain users' threat models; therefore, self-hosting and browsing through Tor are welcomed.
## Installation
@ -81,7 +156,7 @@ docker run -d --name libreddit -p 80:8080 spikecodes/libreddit
### C) AUR
Libreddit is available from the Arch User Repository as [`libreddit-git`](https://aur.archlinux.org/packages/libreddit-git).
For ArchLinux users, Libreddit is available from the AUR as [`libreddit-git`](https://aur.archlinux.org/packages/libreddit-git).
Install:
```
@ -91,8 +166,19 @@ yay -S libreddit-git
### D) GitHub Releases
If you're on Linux and none of these methods work for you, you can grab a Linux binary from [the newest release](https://github.com/spikecodes/libreddit/releases/latest).
Currently, Libreddit does not have Windows or macOS binaries, but those will be available soon.
## Deploy an Instance
### E) Repl.it
**Note:** Repl.it is a free option, but it is *not* private and monitors server usage to prevent abuse. If you really need a free and easy setup, this method may work best for you.
1. Create a Repl.it account (see note above)
2. Visit [the official Repl](https://repl.it/@spikethecoder/libreddit) and fork it
3. Hit the run button to download the latest Libreddit version and start it
In the web preview (defaults to top right), you should see your instance hosted where you can assign a [custom domain](https://docs.repl.it/repls/web-hosting#custom-domains).
## Deployment
Once installed, deploy Libreddit (unless you're using Docker) by running:
@ -105,12 +191,7 @@ Specify a custom address for the server by passing the `-a` or `--address` argum
libreddit --address=0.0.0.0:8111
```
To disable the media proxy built into Libreddit, run:
```
libreddit --no-default-features
```
## Building from Source
## Building
```
git clone https://github.com/spikecodes/libreddit

File diff suppressed because it is too large Load Diff

View File

@ -1,4 +1,4 @@
edition = "2018"
tab_spaces = 2
hard_tabs = true
max_width = 200
max_width = 175

View File

@ -1,21 +1,20 @@
// Import Crates
use actix_web::{get, App, HttpResponse, HttpServer};
use actix_web::{get, middleware::NormalizePath, web, App, HttpResponse, HttpServer};
// Reference local files
mod popular;
mod post;
mod proxy;
mod search;
// mod settings;
mod subreddit;
mod user;
mod proxy;
mod utils;
// Create Services
#[get("/style.css")]
async fn style() -> HttpResponse {
HttpResponse::Ok().content_type("text/css").body(include_str!("../static/style.css"))
}
#[get("/robots.txt")]
async fn robots() -> HttpResponse {
HttpResponse::Ok().body(include_str!("../static/robots.txt"))
}
@ -33,35 +32,54 @@ async fn main() -> std::io::Result<()> {
if args.len() > 1 {
for arg in args {
if arg.starts_with("--address=") || arg.starts_with("-a=") {
let split: Vec<&str> = arg.split("=").collect();
let split: Vec<&str> = arg.split('=').collect();
address = split[1].to_string();
}
}
}
// start http server
println!("Running Libreddit on {}!", address.clone());
println!("Running Libreddit v{} on {}!", env!("CARGO_PKG_VERSION"), &address);
HttpServer::new(|| {
App::new()
// TRAILING SLASH MIDDLEWARE
.wrap(NormalizePath::default())
// DEFAULT SERVICE
.default_service(web::get().to(utils::error))
// GENERAL SERVICES
.service(style)
.service(favicon)
.service(robots)
.route("/style.css/", web::get().to(style))
.route("/favicon.ico/", web::get().to(HttpResponse::Ok))
.route("/robots.txt/", web::get().to(robots))
// SETTINGS SERVICE
// .route("/settings/", web::get().to(settings::get))
// .route("/settings/save/", web::post().to(settings::set))
// PROXY SERVICE
.service(proxy::handler)
// POST SERVICES
.service(post::short)
.service(post::page)
// SUBREDDIT SERVICES
.service(subreddit::page)
// POPULAR SERVICES
.service(popular::page)
.route("/proxy/{url:.*}/", web::get().to(proxy::handler))
// SEARCH SERVICES
.route("/search/", web::get().to(search::find))
.route("r/{sub}/search/", web::get().to(search::find))
// USER SERVICES
.service(user::page)
.route("/u/{username}/", web::get().to(user::profile))
.route("/user/{username}/", web::get().to(user::profile))
// WIKI SERVICES
.route("/wiki/", web::get().to(subreddit::wiki))
.route("/wiki/{page}/", web::get().to(subreddit::wiki))
.route("/r/{sub}/wiki/", web::get().to(subreddit::wiki))
.route("/r/{sub}/wiki/{page}/", web::get().to(subreddit::wiki))
// SUBREDDIT SERVICES
.route("/r/{sub}/", web::get().to(subreddit::page))
.route("/r/{sub}/{sort:hot|new|top|rising|controversial}/", web::get().to(subreddit::page))
// POPULAR SERVICES
.route("/", web::get().to(subreddit::page))
.route("/{sort:best|hot|new|top|rising|controversial}/", web::get().to(subreddit::page))
// POST SERVICES
.route("/{id:.{5,6}}/", web::get().to(post::item))
.route("/r/{sub}/comments/{id}/{title}/", web::get().to(post::item))
.route("/r/{sub}/comments/{id}/{title}/{comment_id}/", web::get().to(post::item))
})
.bind(address.clone())
.expect(format!("Cannot bind to the address: {}", address).as_str())
.bind(&address)
.unwrap_or_else(|_| panic!("Cannot bind to the address: {}", address))
.run()
.await
}

View File

@ -1,56 +0,0 @@
// CRATES
use actix_web::{get, web, HttpResponse, Result, http::StatusCode};
use askama::Template;
use crate::utils::{fetch_posts, ErrorTemplate, Params, Post};
// STRUCTS
#[derive(Template)]
#[template(path = "popular.html", escape = "none")]
struct PopularTemplate {
posts: Vec<Post>,
sort: String,
ends: (String, String),
}
// RENDER
async fn render(sub_name: String, sort: Option<String>, ends: (Option<String>, Option<String>)) -> Result<HttpResponse> {
let sorting = sort.unwrap_or("hot".to_string());
let before = ends.1.clone().unwrap_or(String::new()); // If there is an after, there must be a before
// Build the Reddit JSON API url
let url = match ends.0 {
Some(val) => format!("https://www.reddit.com/r/{}/{}.json?before={}&count=25", sub_name, sorting, val),
None => match ends.1 {
Some(val) => format!("https://www.reddit.com/r/{}/{}.json?after={}&count=25", sub_name, sorting, val),
None => format!("https://www.reddit.com/r/{}/{}.json", sub_name, sorting),
},
};
let items_result = fetch_posts(url, String::new()).await;
if items_result.is_err() {
let s = ErrorTemplate {
message: items_result.err().unwrap().to_string(),
}
.render()
.unwrap();
Ok(HttpResponse::Ok().status(StatusCode::NOT_FOUND).content_type("text/html").body(s))
} else {
let items = items_result.unwrap();
let s = PopularTemplate {
posts: items.0,
sort: sorting,
ends: (before, items.1),
}
.render()
.unwrap();
Ok(HttpResponse::Ok().content_type("text/html").body(s))
}
}
// SERVICES
#[get("/")]
pub async fn page(params: web::Query<Params>) -> Result<HttpResponse> {
render("popular".to_string(), params.sort.clone(), (params.before.clone(), params.after.clone())).await
}

View File

@ -1,9 +1,11 @@
// CRATES
use actix_web::{get, web, HttpResponse, Result, http::StatusCode};
use crate::utils::{error, format_num, format_url, param, request, rewrite_url, val, Comment, Flags, Flair, Post};
use actix_web::{HttpRequest, HttpResponse, Result};
use async_recursion::async_recursion;
use askama::Template;
use chrono::{TimeZone, Utc};
use pulldown_cmark::{html, Options, Parser};
use crate::utils::{request, val, Comment, ErrorTemplate, Flair, Params, Post};
// STRUCTS
#[derive(Template)]
@ -14,144 +16,135 @@ struct PostTemplate {
sort: String,
}
async fn render(id: String, sort: String) -> Result<HttpResponse> {
// Log the post ID being fetched
println!("id: {}", id);
pub async fn item(req: HttpRequest) -> HttpResponse {
let path = format!("{}.json?{}&raw_json=1", req.path(), req.query_string());
let sort = param(&path, "sort");
// Build the Reddit JSON API url
let url: String = format!("https://reddit.com/{}.json?sort={}", id, sort);
// Log the post ID being fetched in debug mode
#[cfg(debug_assertions)]
dbg!(req.match_info().get("id").unwrap_or(""));
// Send a request to the url, receive JSON in response
let req = request(url).await;
match request(&path).await {
// Otherwise, grab the JSON output from the request
Ok(res) => {
// Parse the JSON into Post and Comment structs
let post = parse_post(&res[0]).await.unwrap();
let comments = parse_comments(&res[1]).await.unwrap();
// If the Reddit API returns an error, exit and send error page to user
if req.is_err() {
let s = ErrorTemplate {
message: req.err().unwrap().to_string(),
// Use the Post and Comment structs to generate a website to show users
let s = PostTemplate { comments, post, sort }.render().unwrap();
HttpResponse::Ok().content_type("text/html").body(s)
}
.render()
.unwrap();
return Ok(HttpResponse::Ok().status(StatusCode::NOT_FOUND).content_type("text/html").body(s));
}
// Otherwise, grab the JSON output from the request
let res = req.unwrap();
// Parse the JSON into Post and Comment structs
let post = parse_post(res.clone()).await;
let comments = parse_comments(res).await;
// Use the Post and Comment structs to generate a website to show users
let s = PostTemplate {
comments: comments.unwrap(),
post: post.unwrap(),
sort: sort,
}
.render()
.unwrap();
Ok(HttpResponse::Ok().content_type("text/html").body(s))
}
// SERVICES
#[get("/{id}")]
async fn short(web::Path(id): web::Path<String>) -> Result<HttpResponse> {
render(id.to_string(), "confidence".to_string()).await
}
#[get("/r/{sub}/comments/{id}/{title}/")]
async fn page(web::Path((_sub, id)): web::Path<(String, String)>, params: web::Query<Params>) -> Result<HttpResponse> {
match &params.sort {
Some(sort) => render(id, sort.to_string()).await,
None => render(id, "confidence".to_string()).await,
// If the Reddit API returns an error, exit and send error page to user
Err(msg) => error(msg.to_string()).await,
}
}
// UTILITIES
async fn media(data: &serde_json::Value) -> String {
let post_hint: &str = data["data"]["post_hint"].as_str().unwrap_or("");
let has_media: bool = data["data"]["media"].is_object();
let prefix = if cfg!(feature = "proxy") { "/imageproxy/" } else { "" };
let media: String = if !has_media {
format!(r#"<h4 class="post_body"><a href="{u}">{u}</a></h4>"#, u = data["data"]["url"].as_str().unwrap())
async fn media(data: &serde_json::Value) -> (String, String) {
let post_type: &str;
let url = if !data["preview"]["reddit_video_preview"]["fallback_url"].is_null() {
post_type = "video";
format_url(data["preview"]["reddit_video_preview"]["fallback_url"].as_str().unwrap_or_default().to_string())
} else if !data["secure_media"]["reddit_video"]["fallback_url"].is_null() {
post_type = "video";
format_url(data["secure_media"]["reddit_video"]["fallback_url"].as_str().unwrap_or_default().to_string())
} else if data["post_hint"].as_str().unwrap_or("") == "image" {
post_type = "image";
format_url(data["preview"]["images"][0]["source"]["url"].as_str().unwrap_or_default().to_string())
} else {
format!(r#"<img class="post_image" src="{}{}.png"/>"#, prefix, data["data"]["url"].as_str().unwrap())
post_type = "link";
data["url"].as_str().unwrap_or_default().to_string()
};
match post_hint {
"hosted:video" => format!(
r#"<video class="post_image" src="{}{}" controls/>"#,
prefix, data["data"]["media"]["reddit_video"]["fallback_url"].as_str().unwrap()
),
"image" => format!(r#"<img class="post_image" src="{}{}"/>"#, prefix, data["data"]["url"].as_str().unwrap()),
"self" => String::from(""),
_ => media,
}
}
async fn markdown_to_html(md: &str) -> String {
let mut options = Options::empty();
options.insert(Options::ENABLE_TABLES);
options.insert(Options::ENABLE_FOOTNOTES);
options.insert(Options::ENABLE_STRIKETHROUGH);
options.insert(Options::ENABLE_TASKLISTS);
let parser = Parser::new_ext(md, options);
// Write to String buffer.
let mut html_output = String::new();
html::push_html(&mut html_output, parser);
html_output
(post_type.to_string(), url)
}
// POSTS
async fn parse_post(json: serde_json::Value) -> Result<Post, &'static str> {
let post_data: &serde_json::Value = &json[0]["data"]["children"][0];
async fn parse_post(json: &serde_json::Value) -> Result<Post, &'static str> {
// Retrieve post (as opposed to comments) from JSON
let post: &serde_json::Value = &json["data"]["children"][0];
let unix_time: i64 = post_data["data"]["created_utc"].as_f64().unwrap().round() as i64;
let score = post_data["data"]["score"].as_i64().unwrap();
// Grab UTC time as unix timestamp
let unix_time: i64 = post["data"]["created_utc"].as_f64().unwrap_or_default().round() as i64;
// Parse post score and upvote ratio
let score = post["data"]["score"].as_i64().unwrap_or_default();
let ratio: f64 = post["data"]["upvote_ratio"].as_f64().unwrap_or(1.0) * 100.0;
let post = Post {
title: val(post_data, "title").await,
community: val(post_data, "subreddit").await,
body: markdown_to_html(post_data["data"]["selftext"].as_str().unwrap()).await,
author: val(post_data, "author").await,
url: val(post_data, "permalink").await,
score: if score > 1000 { format!("{}k", score / 1000) } else { score.to_string() },
media: media(post_data).await,
time: Utc.timestamp(unix_time, 0).format("%b %e %Y %H:%M UTC").to_string(),
// Determine the type of media along with the media URL
let media = media(&post["data"]).await;
// Build a post using data parsed from Reddit post API
Ok(Post {
id: val(post, "id"),
title: val(post, "title"),
community: val(post, "subreddit"),
body: rewrite_url(&val(post, "selftext_html")),
author: val(post, "author"),
author_flair: Flair(
val(post, "author_flair_text"),
val(post, "author_flair_background_color"),
val(post, "author_flair_text_color"),
),
permalink: val(post, "permalink"),
score: format_num(score),
upvote_ratio: ratio as i64,
post_type: media.0,
flair: Flair(
val(post_data, "link_flair_text").await,
val(post_data, "link_flair_background_color").await,
if val(post_data, "link_flair_text_color").await == "dark" {
val(post, "link_flair_text"),
val(post, "link_flair_background_color"),
if val(post, "link_flair_text_color") == "dark" {
"black".to_string()
} else {
"white".to_string()
},
),
};
Ok(post)
flags: Flags {
nsfw: post["data"]["over_18"].as_bool().unwrap_or(false),
stickied: post["data"]["stickied"].as_bool().unwrap_or(false),
},
media: media.1,
time: Utc.timestamp(unix_time, 0).format("%b %e %Y %H:%M UTC").to_string(),
})
}
// COMMENTS
async fn parse_comments(json: serde_json::Value) -> Result<Vec<Comment>, &'static str> {
let comment_data = json[1]["data"]["children"].as_array().unwrap();
#[async_recursion]
async fn parse_comments(json: &serde_json::Value) -> Result<Vec<Comment>, &'static str> {
// Separate the comment JSON into a Vector of comments
let comment_data = json["data"]["children"].as_array().unwrap();
let mut comments: Vec<Comment> = Vec::new();
for comment in comment_data.iter() {
// For each comment, retrieve the values to build a Comment object
for comment in comment_data {
let unix_time: i64 = comment["data"]["created_utc"].as_f64().unwrap_or(0.0).round() as i64;
let score = comment["data"]["score"].as_i64().unwrap_or(0);
let body = markdown_to_html(comment["data"]["body"].as_str().unwrap_or("")).await;
if unix_time == 0 {
continue;
}
// println!("{}", body);
let score = comment["data"]["score"].as_i64().unwrap_or(0);
let body = rewrite_url(&val(comment, "body_html"));
let replies: Vec<Comment> = if comment["data"]["replies"].is_object() {
parse_comments(&comment["data"]["replies"]).await.unwrap_or_default()
} else {
Vec::new()
};
comments.push(Comment {
body: body,
author: val(comment, "author").await,
score: if score > 1000 { format!("{}k", score / 1000) } else { score.to_string() },
id: val(comment, "id"),
body,
author: val(comment, "author"),
score: format_num(score),
time: Utc.timestamp(unix_time, 0).format("%b %e %Y %H:%M UTC").to_string(),
replies,
flair: Flair(
val(comment, "author_flair_text"),
val(comment, "author_flair_background_color"),
val(comment, "author_flair_text_color"),
),
});
}

View File

@ -1,18 +1,46 @@
use actix_web::{get, web, HttpResponse, Result, client::Client, Error};
use actix_web::{client::Client, error, web, Error, HttpResponse, Result};
use url::Url;
#[get("/imageproxy/{url:.*}")]
async fn handler(web::Path(url): web::Path<String>) -> Result<HttpResponse> {
if cfg!(feature = "proxy") {
dbg!(&url);
let client = Client::default();
client.get(url)
.send()
.await
.map_err(Error::from)
.and_then(|res| {
Ok(HttpResponse::build(res.status()).streaming(res))
})
} else {
Ok(HttpResponse::Ok().body(""))
use base64::decode;
pub async fn handler(web::Path(b64): web::Path<String>) -> Result<HttpResponse> {
let domains = vec![
// THUMBNAILS
"a.thumbs.redditmedia.com",
"b.thumbs.redditmedia.com",
// ICONS
"styles.redditmedia.com",
"www.redditstatic.com",
// PREVIEWS
"preview.redd.it",
"external-preview.redd.it",
// MEDIA
"i.redd.it",
"v.redd.it",
];
match decode(b64) {
Ok(bytes) => {
let media = String::from_utf8(bytes).unwrap_or_default();
match Url::parse(media.as_str()) {
Ok(url) => {
let domain = url.domain().unwrap_or_default();
if domains.contains(&domain) {
Client::default()
.get(media.replace("&amp;", "&"))
.send()
.await
.map_err(Error::from)
.map(|res| HttpResponse::build(res.status()).streaming(res))
} else {
Err(error::ErrorForbidden("Resource must be from Reddit"))
}
}
Err(_) => Err(error::ErrorBadRequest("Can't parse encoded base64 URL")),
}
}
Err(_) => Err(error::ErrorBadRequest("Can't decode base64 URL")),
}
}
}
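For illustration, a minimal sketch of the base64 round trip this proxy relies on: `format_url` in the `src/utils.rs` diff further down encodes a media URL into the `/proxy/{url}` path, and the handler above decodes it back before checking the domain whitelist. Assumes the `base64` 0.13 crate from Cargo.toml; the media URL is illustrative.
```
// Sketch of the proxy URL round trip: format_url (src/utils.rs) encodes,
// the /proxy/{url} handler decodes. Assumes the base64 0.13 crate;
// the media URL is illustrative.
use base64::{decode, encode};

fn main() {
  let media = "https://i.redd.it/example.png";

  // Encoding side (format_url): build the proxied path.
  let proxied = format!("/proxy/{}", encode(media));
  println!("{}", proxied); // "/proxy/<base64 of the URL>"

  // Decoding side (handler): recover the original URL; the real handler
  // then checks it against the whitelist of Reddit domains.
  let b64 = proxied.trim_start_matches("/proxy/");
  let url = String::from_utf8(decode(b64).expect("valid base64")).unwrap_or_default();
  assert_eq!(url, media);
}
```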

53
src/search.rs Normal file
View File

@ -0,0 +1,53 @@
// CRATES
use crate::utils::{error, fetch_posts, param, Post};
use actix_web::{HttpRequest, HttpResponse};
use askama::Template;
// STRUCTS
struct SearchParams {
q: String,
sort: String,
t: String,
before: String,
after: String,
restrict_sr: String,
}
#[derive(Template)]
#[template(path = "search.html", escape = "none")]
struct SearchTemplate {
posts: Vec<Post>,
sub: String,
params: SearchParams,
}
// SERVICES
pub async fn find(req: HttpRequest) -> HttpResponse {
let path = format!("{}.json?{}", req.path(), req.query_string());
let sort = if param(&path, "sort").is_empty() {
"relevance".to_string()
} else {
param(&path, "sort")
};
let sub = req.match_info().get("sub").unwrap_or("").to_string();
match fetch_posts(&path, String::new()).await {
Ok(posts) => HttpResponse::Ok().content_type("text/html").body(
SearchTemplate {
posts: posts.0,
sub,
params: SearchParams {
q: param(&path, "q"),
sort,
t: param(&path, "t"),
before: param(&path, "after"),
after: posts.1,
restrict_sr: param(&path, "restrict_sr"),
},
}
.render()
.unwrap(),
),
Err(msg) => error(msg.to_string()).await,
}
}

48
src/settings.rs Normal file
View File

@ -0,0 +1,48 @@
// // CRATES
// use crate::utils::cookies;
// use actix_web::{cookie::Cookie, web::Form, HttpRequest, HttpResponse, Result}; // http::Method,
// use askama::Template;
// // STRUCTS
// #[derive(Template)]
// #[template(path = "settings.html", escape = "none")]
// struct SettingsTemplate {
// pref_nsfw: String,
// }
// #[derive(serde::Deserialize)]
// pub struct Preferences {
// pref_nsfw: Option<String>,
// }
// // FUNCTIONS
// // Retrieve cookies from request "Cookie" header
// pub async fn get(req: HttpRequest) -> Result<HttpResponse> {
// let cookies = cookies(req);
// let pref_nsfw: String = cookies.get("pref_nsfw").unwrap_or(&String::new()).to_owned();
// let s = SettingsTemplate { pref_nsfw }.render().unwrap();
// Ok(HttpResponse::Ok().content_type("text/html").body(s))
// }
// // Set cookies using response "Set-Cookie" header
// pub async fn set(form: Form<Preferences>) -> HttpResponse {
// let nsfw: Cookie = match &form.pref_nsfw {
// Some(value) => Cookie::build("pref_nsfw", value).path("/").secure(true).http_only(true).finish(),
// None => Cookie::build("pref_nsfw", "").finish(),
// };
// let body = SettingsTemplate {
// pref_nsfw: form.pref_nsfw.clone().unwrap_or_default(),
// }
// .render()
// .unwrap();
// HttpResponse::Found()
// .content_type("text/html")
// .set_header("Set-Cookie", nsfw.to_string())
// .set_header("Location", "/settings")
// .body(body)
// }

View File

@ -1,7 +1,7 @@
// CRATES
use actix_web::{get, web, HttpResponse, Result, http::StatusCode};
use crate::utils::{error, fetch_posts, format_num, format_url, param, request, rewrite_url, val, Post, Subreddit};
use actix_web::{HttpRequest, HttpResponse, Result};
use askama::Template;
use crate::utils::{request, val, fetch_posts, ErrorTemplate, Params, Post, Subreddit};
// STRUCTS
#[derive(Template)]
@ -9,89 +9,97 @@ use crate::utils::{request, val, fetch_posts, ErrorTemplate, Params, Post, Subre
struct SubredditTemplate {
sub: Subreddit,
posts: Vec<Post>,
sort: String,
ends: (String, String)
sort: (String, String),
ends: (String, String),
}
#[derive(Template)]
#[template(path = "wiki.html", escape = "none")]
struct WikiTemplate {
sub: String,
wiki: String,
page: String,
}
// SERVICES
#[allow(dead_code)]
#[get("/r/{sub}")]
async fn page(web::Path(sub): web::Path<String>, params: web::Query<Params>) -> Result<HttpResponse> {
render(sub, params.sort.clone(), (params.before.clone(), params.after.clone())).await
}
pub async fn page(req: HttpRequest) -> HttpResponse {
let path = format!("{}.json?{}", req.path(), req.query_string());
let sub = req.match_info().get("sub").unwrap_or("popular").to_string();
let sort = req.match_info().get("sort").unwrap_or("hot").to_string();
pub async fn render(sub_name: String, sort: Option<String>, ends: (Option<String>, Option<String>)) -> Result<HttpResponse> {
let sorting = sort.unwrap_or("hot".to_string());
let before = ends.1.clone().unwrap_or(String::new()); // If there is an after, there must be a before
// Build the Reddit JSON API url
let url = match ends.0 {
Some(val) => format!("https://www.reddit.com/r/{}/{}.json?before={}&count=25", sub_name, sorting, val),
None => match ends.1 {
Some(val) => format!("https://www.reddit.com/r/{}/{}.json?after={}&count=25", sub_name, sorting, val),
None => format!("https://www.reddit.com/r/{}/{}.json", sub_name, sorting),
},
let sub_result = if !&sub.contains('+') && sub != "popular" {
subreddit(&sub).await.unwrap_or_default()
} else {
Subreddit::default()
};
let sub_result = subreddit(&sub_name).await;
let items_result = fetch_posts(url, String::new()).await;
if sub_result.is_err() || items_result.is_err() {
let s = ErrorTemplate {
message: sub_result.err().unwrap().to_string(),
match fetch_posts(&path, String::new()).await {
Ok(items) => {
let s = SubredditTemplate {
sub: sub_result,
posts: items.0,
sort: (sort, param(&path, "t")),
ends: (param(&path, "after"), items.1),
}
.render()
.unwrap();
HttpResponse::Ok().content_type("text/html").body(s)
}
.render()
.unwrap();
Ok(HttpResponse::Ok().status(StatusCode::NOT_FOUND).content_type("text/html").body(s))
} else {
let mut sub = sub_result.unwrap();
let items = items_result.unwrap();
Err(msg) => error(msg.to_string()).await,
}
}
sub.icon = if sub.icon != "" {
format!(r#"<img class="subreddit_icon" src="{}">"#, sub.icon)
} else {
String::new()
};
pub async fn wiki(req: HttpRequest) -> HttpResponse {
let sub = req.match_info().get("sub").unwrap_or("reddit.com");
let page = req.match_info().get("page").unwrap_or("index");
let path: String = format!("r/{}/wiki/{}.json?raw_json=1", sub, page);
let s = SubredditTemplate {
sub: sub,
posts: items.0,
sort: sorting,
ends: (before, items.1)
match request(&path).await {
Ok(res) => {
let s = WikiTemplate {
sub: sub.to_string(),
wiki: rewrite_url(res["data"]["content_html"].as_str().unwrap_or_default()),
page: page.to_string(),
}
.render()
.unwrap();
HttpResponse::Ok().content_type("text/html").body(s)
}
.render()
.unwrap();
Ok(HttpResponse::Ok().content_type("text/html").body(s))
Err(msg) => error(msg.to_string()).await,
}
}
// SUBREDDIT
async fn subreddit(sub: &String) -> Result<Subreddit, &'static str> {
async fn subreddit(sub: &str) -> Result<Subreddit, &'static str> {
// Build the Reddit JSON API url
let url: String = format!("https://www.reddit.com/r/{}/about.json", sub);
let path: String = format!("r/{}/about.json?raw_json=1", sub);
// Send a request to the url, receive JSON in response
let req = request(url).await;
// Send a request to the url
match request(&path).await {
// If success, receive JSON in response
Ok(res) => {
// Metadata regarding the subreddit
let members: i64 = res["data"]["subscribers"].as_u64().unwrap_or_default() as i64;
let active: i64 = res["data"]["accounts_active"].as_u64().unwrap_or_default() as i64;
// If the Reddit API returns an error, exit this function
if req.is_err() {
return Err(req.err().unwrap());
// Fetch subreddit icon either from the community_icon or icon_img value
let community_icon: &str = res["data"]["community_icon"].as_str().unwrap_or("").split('?').collect::<Vec<&str>>()[0];
let icon = if community_icon.is_empty() { val(&res, "icon_img") } else { community_icon.to_string() };
let sub = Subreddit {
name: val(&res, "display_name"),
title: val(&res, "title"),
description: val(&res, "public_description"),
info: rewrite_url(&val(&res, "description_html").replace("\\", "")),
icon: format_url(icon),
members: format_num(members),
active: format_num(active),
wiki: res["data"]["wiki_enabled"].as_bool().unwrap_or_default(),
};
Ok(sub)
}
// If the Reddit API returns an error, exit this function
Err(msg) => return Err(msg),
}
// Otherwise, grab the JSON output from the request
let res = req.unwrap();
let members = res["data"]["subscribers"].as_u64().unwrap_or(0);
let active = res["data"]["accounts_active"].as_u64().unwrap_or(0);
let sub = Subreddit {
name: val(&res, "display_name").await,
title: val(&res, "title").await,
description: val(&res, "public_description").await,
icon: val(&res, "icon_img").await,
members: if members > 1000 { format!("{}k", members / 1000) } else { members.to_string() },
active: if active > 1000 { format!("{}k", active / 1000) } else { active.to_string() },
};
Ok(sub)
}
}

View File

@ -1,7 +1,8 @@
// CRATES
use actix_web::{get, web, HttpResponse, Result, http::StatusCode};
use crate::utils::{error, fetch_posts, format_url, nested_val, param, request, Post, User};
use actix_web::{HttpRequest, HttpResponse, Result};
use askama::Template;
use crate::utils::{nested_val, request, fetch_posts, ErrorTemplate, Params, Post, User};
use chrono::{TimeZone, Utc};
// STRUCTS
#[derive(Template)]
@ -9,66 +10,68 @@ use crate::utils::{nested_val, request, fetch_posts, ErrorTemplate, Params, Post
struct UserTemplate {
user: User,
posts: Vec<Post>,
sort: String,
sort: (String, String),
ends: (String, String),
}
async fn render(username: String, sort: String) -> Result<HttpResponse> {
// Build the Reddit JSON API url
let url: String = format!("https://www.reddit.com/user/{}/.json?sort={}", username, sort);
// FUNCTIONS
pub async fn profile(req: HttpRequest) -> HttpResponse {
// Build the Reddit JSON API path
let path = format!("{}.json?{}&raw_json=1", req.path(), req.query_string());
// Retrieve other variables from Libreddit request
let sort = param(&path, "sort");
let username = req.match_info().get("username").unwrap_or("").to_string();
// Request user profile data and user posts/comments from Reddit
let user = user(&username).await;
let posts = fetch_posts(url, "Comment".to_string()).await;
let posts = fetch_posts(&path, "Comment".to_string()).await;
if user.is_err() || posts.is_err() {
let s = ErrorTemplate {
message: user.err().unwrap().to_string(),
match posts {
Ok(items) => {
let s = UserTemplate {
user: user.unwrap(),
posts: items.0,
sort: (sort, param(&path, "t")),
ends: (param(&path, "after"), items.1),
}
.render()
.unwrap();
HttpResponse::Ok().content_type("text/html").body(s)
}
.render()
.unwrap();
Ok(HttpResponse::Ok().status(StatusCode::NOT_FOUND).content_type("text/html").body(s))
} else {
let s = UserTemplate {
user: user.unwrap(),
posts: posts.unwrap().0,
sort: sort,
}
.render()
.unwrap();
Ok(HttpResponse::Ok().content_type("text/html").body(s))
}
}
// SERVICES
#[get("/u/{username}")]
async fn page(web::Path(username): web::Path<String>, params: web::Query<Params>) -> Result<HttpResponse> {
match &params.sort {
Some(sort) => render(username, sort.to_string()).await,
None => render(username, "hot".to_string()).await,
// If there is an error show error page
Err(msg) => error(msg.to_string()).await,
}
}
// USER
async fn user(name: &String) -> Result<User, &'static str> {
// Build the Reddit JSON API url
let url: String = format!("https://www.reddit.com/user/{}/about.json", name);
async fn user(name: &str) -> Result<User, &'static str> {
// Build the Reddit JSON API path
let path: String = format!("user/{}/about.json", name);
// Send a request to the url, receive JSON in response
let req = request(url).await;
let res;
// If the Reddit API returns an error, exit this function
if req.is_err() {
return Err(req.err().unwrap());
// Send a request to the url
match request(&path).await {
// If success, receive JSON in response
Ok(response) => {
res = response;
}
// If the Reddit API returns an error, exit this function
Err(msg) => return Err(msg),
}
// Otherwise, grab the JSON output from the request
let res = req.unwrap();
// Grab creation date as unix timestamp
let created: i64 = res["data"]["created"].as_f64().unwrap_or(0.0).round() as i64;
// Parse the JSON output into a User struct
Ok(User {
name: name.to_string(),
icon: nested_val(&res, "subreddit", "icon_img").await,
karma: res["data"]["total_karma"].as_i64().unwrap(),
banner: nested_val(&res, "subreddit", "banner_img").await,
description: nested_val(&res, "subreddit", "public_description").await,
title: nested_val(&res, "subreddit", "title"),
icon: format_url(nested_val(&res, "subreddit", "icon_img")),
karma: res["data"]["total_karma"].as_i64().unwrap_or(0),
created: Utc.timestamp(created, 0).format("%b %e, %Y").to_string(),
banner: nested_val(&res, "subreddit", "banner_img"),
description: nested_val(&res, "subreddit", "public_description"),
})
}
}

View File

@ -1,189 +1,276 @@
// use std::collections::HashMap;
//
// CRATES
//
use actix_web::{HttpResponse, Result};
use askama::Template;
use base64::encode;
use chrono::{TimeZone, Utc};
use surf::{get, client, middleware::Redirect};
use serde_json::{Value, from_str};
use regex::Regex;
use serde_json::from_str;
use url::Url;
// use surf::{client, get, middleware::Redirect};
//
// STRUCTS
//
#[allow(dead_code)]
// Post flair with text, background color and foreground color
pub struct Flair(pub String, pub String, pub String);
// Post flags with nsfw and stickied
pub struct Flags {
pub nsfw: bool,
pub stickied: bool,
}
#[allow(dead_code)]
// Post containing content, metadata and media
pub struct Post {
pub id: String,
pub title: String,
pub community: String,
pub body: String,
pub author: String,
pub url: String,
pub author_flair: Flair,
pub permalink: String,
pub score: String,
pub upvote_ratio: i64,
pub post_type: String,
pub flair: Flair,
pub flags: Flags,
pub media: String,
pub time: String,
pub flair: Flair
}
#[allow(dead_code)]
// Comment with content, post, score and date/time that it was posted
pub struct Comment {
pub id: String,
pub body: String,
pub author: String,
pub flair: Flair,
pub score: String,
pub time: String
pub time: String,
pub replies: Vec<Comment>,
}
#[allow(dead_code)]
// User struct containing metadata about user
pub struct User {
pub name: String,
pub title: String,
pub icon: String,
pub karma: i64,
pub created: String,
pub banner: String,
pub description: String
pub description: String,
}
#[allow(dead_code)]
#[derive(Default)]
// Subreddit struct containing metadata about community
pub struct Subreddit {
pub name: String,
pub title: String,
pub description: String,
pub info: String,
pub icon: String,
pub members: String,
pub active: String
pub active: String,
pub wiki: bool,
}
// Parser for query params, used in sorting (eg. /r/rust/?sort=hot)
#[derive(serde::Deserialize)]
pub struct Params {
pub t: Option<String>,
pub q: Option<String>,
pub sort: Option<String>,
pub after: Option<String>,
pub before: Option<String>
pub before: Option<String>,
}
// Error template
#[derive(askama::Template)]
#[derive(Template)]
#[template(path = "error.html", escape = "none")]
pub struct ErrorTemplate {
pub message: String
pub message: String,
}
//
// FORMATTING
//
// Grab a query param from a url
pub fn param(path: &str, value: &str) -> String {
let url = Url::parse(format!("https://libredd.it/{}", path).as_str()).unwrap();
let pairs: std::collections::HashMap<_, _> = url.query_pairs().into_owned().collect();
pairs.get(value).unwrap_or(&String::new()).to_owned()
}
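For illustration, a usage sketch of `param` above (the request path and query values are hypothetical):
```
// Usage sketch for param: extract a query parameter from a request path
// by parsing it against the libredd.it base URL. Values are illustrative.
use url::Url;

fn param(path: &str, value: &str) -> String {
  let url = Url::parse(format!("https://libredd.it/{}", path).as_str()).unwrap();
  let pairs: std::collections::HashMap<_, _> = url.query_pairs().into_owned().collect();
  pairs.get(value).unwrap_or(&String::new()).to_owned()
}

fn main() {
  let path = "/r/rust/search.json?q=async&sort=top&t=week";
  assert_eq!(param(path, "q"), "async");
  assert_eq!(param(path, "sort"), "top");
  assert_eq!(param(path, "missing"), ""); // absent params default to empty
}
```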
// Cookies from request
// pub fn cookies(req: HttpRequest) -> HashMap<String, String> {
// let mut result: HashMap<String, String> = HashMap::new();
// let cookies: Vec<Cookie> = req
// .headers()
// .get_all("Cookie")
// .map(|value| value.to_str().unwrap())
// .map(|unparsed| Cookie::parse(unparsed).unwrap())
// .collect();
// for cookie in cookies {
// result.insert(cookie.name().to_string(), cookie.value().to_string());
// }
// result
// }
// Direct urls to proxy if proxy is enabled
pub fn format_url(url: String) -> String {
if url.is_empty() || url == "self" || url == "default" {
String::new()
} else {
format!("/proxy/{}", encode(url).as_str())
}
}
// Rewrite Reddit links to Libreddit in body of text
pub fn rewrite_url(text: &str) -> String {
let re = Regex::new(r#"href="(https://|http://|)(www.|)(reddit).(com)/"#).unwrap();
re.replace_all(text, r#"href="/"#).to_string()
}
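For illustration, a usage sketch of `rewrite_url` above (the input HTML is hypothetical):
```
// Usage sketch for rewrite_url: Reddit links in post/comment HTML
// become relative Libreddit links. The input HTML is illustrative.
use regex::Regex;

fn rewrite_url(text: &str) -> String {
  let re = Regex::new(r#"href="(https://|http://|)(www.|)(reddit).(com)/"#).unwrap();
  re.replace_all(text, r#"href="/"#).to_string()
}

fn main() {
  let body = r#"<a href="https://www.reddit.com/r/rust/comments/abc123/">thread</a>"#;
  assert_eq!(rewrite_url(body), r#"<a href="/r/rust/comments/abc123/">thread</a>"#);
}
```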
// Append `m` and `k` for millions and thousands respectively
pub fn format_num(num: i64) -> String {
if num > 1000000 {
format!("{}m", num / 1000000)
} else if num > 1000 {
format!("{}k", num / 1000)
} else {
num.to_string()
}
}
//
// JSON PARSING
//
#[allow(dead_code)]
// val() function used to parse JSON from Reddit APIs
pub async fn val(j: &serde_json::Value, k: &str) -> String {
String::from(j["data"][k].as_str().unwrap_or(""))
pub fn val(j: &serde_json::Value, k: &str) -> String {
String::from(j["data"][k].as_str().unwrap_or_default())
}
#[allow(dead_code)]
// nested_val() function used to parse JSON from Reddit APIs
pub async fn nested_val(j: &serde_json::Value, n: &str, k: &str) -> String {
String::from(j["data"][n][k].as_str().unwrap())
pub fn nested_val(j: &serde_json::Value, n: &str, k: &str) -> String {
String::from(j["data"][n][k].as_str().unwrap_or_default())
}
#[allow(dead_code)]
pub async fn fetch_posts(url: String, fallback_title: String) -> Result<(Vec<Post>, String), &'static str> {
// Send a request to the url, receive JSON in response
let req = request(url).await;
// Fetch posts of a user or subreddit
pub async fn fetch_posts(path: &str, fallback_title: String) -> Result<(Vec<Post>, String), &'static str> {
let res;
let post_list;
// If the Reddit API returns an error, exit this function
if req.is_err() {
return Err(req.err().unwrap());
// Send a request to the url
match request(&path).await {
// If success, receive JSON in response
Ok(response) => {
res = response;
}
// If the Reddit API returns an error, exit this function
Err(msg) => return Err(msg),
}
// Otherwise, grab the JSON output from the request
let res = req.unwrap();
// Fetch the list of posts from the JSON response
let post_list = res["data"]["children"].as_array().unwrap();
match res["data"]["children"].as_array() {
Some(list) => post_list = list,
None => return Err("No posts found"),
}
let mut posts: Vec<Post> = Vec::new();
for post in post_list.iter() {
let img = if val(post, "thumbnail").await.starts_with("https:/") {
val(post, "thumbnail").await
} else {
String::new()
};
let unix_time: i64 = post["data"]["created_utc"].as_f64().unwrap().round() as i64;
let score = post["data"]["score"].as_i64().unwrap();
let title = val(post, "title").await;
// For each post from posts list
for post in post_list {
let img = format_url(val(post, "thumbnail"));
let unix_time: i64 = post["data"]["created_utc"].as_f64().unwrap_or_default().round() as i64;
let score = post["data"]["score"].as_i64().unwrap_or_default();
let ratio: f64 = post["data"]["upvote_ratio"].as_f64().unwrap_or(1.0) * 100.0;
let title = val(post, "title");
posts.push(Post {
id: val(post, "id"),
title: if title.is_empty() { fallback_title.to_owned() } else { title },
community: val(post, "subreddit").await,
body: val(post, "body").await,
author: val(post, "author").await,
score: if score > 1000 { format!("{}k", score / 1000) } else { score.to_string() },
community: val(post, "subreddit"),
body: rewrite_url(&val(post, "body_html")),
author: val(post, "author"),
author_flair: Flair(
val(post, "author_flair_text"),
val(post, "author_flair_background_color"),
val(post, "author_flair_text_color"),
),
score: format_num(score),
upvote_ratio: ratio as i64,
post_type: "link".to_string(),
media: img,
url: val(post, "permalink").await,
time: Utc.timestamp(unix_time, 0).format("%b %e '%y").to_string(),
flair: Flair(
val(post, "link_flair_text").await,
val(post, "link_flair_background_color").await,
if val(post, "link_flair_text_color").await == "dark" {
val(post, "link_flair_text"),
val(post, "link_flair_background_color"),
if val(post, "link_flair_text_color") == "dark" {
"black".to_string()
} else {
"white".to_string()
},
),
flags: Flags {
nsfw: post["data"]["over_18"].as_bool().unwrap_or_default(),
stickied: post["data"]["stickied"].as_bool().unwrap_or_default(),
},
permalink: val(post, "permalink"),
time: Utc.timestamp(unix_time, 0).format("%b %e '%y").to_string(),
});
}
Ok((posts, res["data"]["after"].as_str().unwrap_or("").to_string()))
Ok((posts, res["data"]["after"].as_str().unwrap_or_default().to_string()))
}
//
// NETWORKING
//
pub async fn error(msg: String) -> HttpResponse {
let body = ErrorTemplate { message: msg }.render().unwrap_or_default();
HttpResponse::NotFound().content_type("text/html").body(body)
}
// Make a request to a Reddit API and parse the JSON response
#[allow(dead_code)]
pub async fn request(url: String) -> Result<serde_json::Value, &'static str> {
// --- actix-web::client ---
// let client = actix_web::client::Client::default();
// let res = client
// .get(url)
// .send()
// .await?
// .body()
// .limit(1000000)
// .await?;
pub async fn request(path: &str) -> Result<serde_json::Value, &'static str> {
let url = format!("https://www.reddit.com/{}", path);
// let body = std::str::from_utf8(res.as_ref())?; // .as_ref converts Bytes to [u8]
// --- surf ---
let req = get(&url).header("User-Agent", "libreddit");
let client = client().with(Redirect::new(5));
let mut res = client.send(req).await.unwrap();
let success = res.status().is_success();
let body = res.body_string().await.unwrap();
dbg!(url.clone());
// --- reqwest ---
// let res = reqwest::get(&url).await.unwrap();
// // Read the status from the response
// let success = res.status().is_success();
// // Read the body of the response
// let body = res.text().await.unwrap();
// Parse the response from Reddit as JSON
let json: Value = from_str(body.as_str()).unwrap_or(Value::Null);
if !success {
println!("! {} - {}", url, "Page not found");
Err("Page not found")
} else if json == Value::Null {
println!("! {} - {}", url, "Failed to parse page JSON data");
Err("Failed to parse page JSON data")
} else {
Ok(json)
// Send request using reqwest
match reqwest::get(&url).await {
Ok(res) => {
// Read the status from the response
match res.status().is_success() {
true => {
// Parse the response from Reddit as JSON
match from_str(res.text().await.unwrap_or_default().as_str()) {
Ok(json) => Ok(json),
Err(_) => {
#[cfg(debug_assertions)]
dbg!(format!("{} - Failed to parse page JSON data", url));
Err("Failed to parse page JSON data")
}
}
}
// If Reddit returns error, tell user Page Not Found
false => {
#[cfg(debug_assertions)]
dbg!(format!("{} - Page not found", url));
Err("Page not found")
}
}
}
// If can't send request to Reddit, return this to user
Err(e) => {
#[cfg(debug_assertions)]
dbg!(format!("{} - {}", url, e));
Err("Couldn't send request to Reddit")
}
}
}

View File

@ -1,2 +1,2 @@
User-Agent: *
Disallow: /
User-agent: *
Allow: /

View File

@ -1,36 +1,57 @@
/* General */
:root {
--accent: aqua;
--background: #0F0F0F;
--foreground: #222;
--outside: #1F1F1F;
--post: #161616;
--highlighted: #333;
--black-contrast: 0 1px 3px rgba(0,0,0,0.5);
}
::selection {
color: var(--background);
background: var(--accent);
}
* {
transition: 0.2s all;
margin: 0px;
margin: 0;
color: white;
font-family: sans-serif;
font-weight: normal;
}
html {
background: black;
body {
background: var(--background);
font-size: 15px;
}
header {
nav {
display: flex;
justify-content: space-between;
color: aqua;
background: #151515;
padding: 15px;
font-weight: bold;
align-items: center;
color: var(--accent);
background: var(--outside);
padding: 5px 15px;
font-size: 20px;
min-height: 40px;
}
#lib, #github {
color: white;
}
nav #lib, nav #github, nav #version { color: white; }
nav #version { opacity: 25%; }
main {
display: flex;
justify-content: center;
max-width: 1000px;
padding: 10px 20px;
margin: 20px auto;
}
#column_one {
max-width: 750px;
margin: 0 auto;
margin-top: 25px;
padding: 0px 10px;
border-radius: 5px;
overflow: hidden;
}
footer {
@@ -38,186 +59,339 @@ footer {
justify-content: center;
}
footer > a {
margin-right: 5px;
}
button {
background: none;
border: none;
font-weight: bold;
}
hr {
margin: 20px 0;
}
a {
color: inherit;
text-decoration: none;
transition: 0.2s all;
}
a:not(.post_right):hover {
text-decoration: underline;
}
span {
color: aqua;
img[src=""] {
display: none;
}
#about {
background: #151515;
aside {
flex-grow: 1;
margin: 20px 20px 0 10px;
max-width: 350px;
}
/* Subreddit */
.panel {
border: 1px solid var(--highlighted);
}
.subreddit {
max-width: 750px;
margin: 0 auto;
.dot {
font-size: 12px;
opacity: 0.5;
}
/* User & Subreddit */
#user, #subreddit, #sidebar {
margin: 40px auto 0 auto;
display: flex;
padding-bottom: 25px;
}
.subreddit_name {
margin-bottom: 10px;
}
.subreddit_right {
display: flex;
flex-flow: column;
justify-content: center;
}
.subreddit_icon {
width: 100px;
height: 100px;
border-radius: 100%;
padding: 20px;
}
#stats {
margin-top: 10px;
}
/* User */
.user {
max-width: 750px;
margin: 0 auto;
display: flex;
}
.user_right {
display: flex;
flex-flow: column;
justify-content: center;
}
.user_icon {
width: 100px;
height: 100px;
border-radius: 100%;
padding: 20px;
}
/* Sorting */
#sort {
max-width: 750px;
margin: 20px -10px;
display: flex;
justify-content: start;
padding: 0px 10px;
}
#sort > div, footer > a {
background: #151515;
color: lightgrey;
flex-direction: column;
align-items: center;
height: max-content;
background: var(--outside);
border-radius: 5px;
margin-right: 5px;
overflow: hidden;
}
#user *, #subreddit * { text-align: center; }
#user, #sub_meta, #sidebar_contents { padding: 20px; }
#sidebar, #sidebar_contents { margin-top: 10px; }
#sidebar_label { padding: 10px; }
#user_icon, #sub_icon {
width: 100px;
height: 100px;
border: 2px solid var(--accent);
border-radius: 100%;
padding: 10px;
margin: 10px;
}
#user_title, #sub_title {
margin: 0 20px;
font-size: 20px;
font-weight: bold;
}
#user_description, #sub_description {
margin: 0 20px;
}
#user_name, #user_description:not(:empty), #user_icon,
#sub_name, #sub_icon, #sub_description:not(:empty) {
margin-bottom: 20px;
}
#user_details, #sub_details {
display: grid;
grid-template-columns: repeat(2, 1fr);
grid-column-gap: 20px;
}
#user_details > label, #sub_details > label {
color: var(--accent);
}
/* Wiki Pages */
#wiki {
background: var(--foreground);
padding: 35px;
}
#top {
background: var(--highlighted);
width: 100%;
display: flex;
}
#top > * {
flex-grow: 1;
text-align: center;
height: 35px;
line-height: 35px;
}
#top > div {
border-bottom: 2px solid white;
}
/* Sorting and Search */
select {
background: var(--outside);
transition: 0.2s all;
}
select, #search {
border: none;
padding: 0 15px;
height: 40px;
appearance: none;
border-radius: 5px 0px 0px 5px;
}
#searchbox {
display: flex;
box-shadow: var(--black-contrast);
}
#searchbox > *, #sort_submit {
background: var(--highlighted);
height: 40px;
}
#search {
border-right: 2px var(--outside) solid;
min-width: 0;
flex-grow: 1;
}
#inside {
display: flex;
align-items: center;
border-right: 2px var(--outside) solid;
height: 40px;
padding: 0 10px;
}
#restrict_sr { margin-right: 5px; }
input[type="submit"] {
border: 0;
border-radius: 0px 5px 5px 0px;
transition: 0.2s all;
}
select:hover { background: var(--foreground); }
input[type="submit"]:hover { color: var(--accent); }
#timeframe {
margin: 0 2px;
border-radius: 0;
}
#sort_options + #timeframe:not(#search_sort > #timeframe) {
margin-left: 10px;
border-radius: 5px 0px 0px 5px;
}
#search_sort {
background: var(--highlighted);
border-radius: 5px;
overflow: auto;
}
#search_sort > #search {
border: 0;
background: transparent;
}
#search_sort > *, #searchbox > * { font-size: 15px; }
#search_sort > :not(:first-child), #search_sort > #sort_options {
margin: 0;
border-radius: 0;
border-right: 0;
border-left: 2px solid var(--background);
box-shadow: none;
background: transparent;
}
#sort_options {
height: 40px;
}
#sort, #search_sort {
display: flex;
align-items: center;
margin-bottom: 20px;
}
#sort_options, footer > a {
border-radius: 5px;
box-shadow: var(--black-contrast);
background: var(--outside);
display: flex;
overflow: auto;
}
#sort_options > a, footer > a {
color: lightgrey;
padding: 10px 20px;
text-align: center;
cursor: pointer;
transition: 0.2s all;
}
#sort > div.selected {
background: aqua;
#sort_options > a.selected {
background: var(--accent);
color: black;
}
#sort > div:hover {
background: #222;
#sort_options > a:not(.selected):hover {
background: var(--foreground);
}
/* Post */
.post {
border-radius: 5px;
background: #151515;
background: var(--post);
box-shadow: var(--black-contrast);
display: flex;
transition: 0.2s all;
}
.post:not(:last-child) { margin-bottom: 10px; }
.post.highlighted {
border: 2px solid #555;
background: #222;
margin: 20px 0;
}
.post.highlighted > .post_left {
background: #333;
.post.highlighted > .post_right {
flex-direction: column;
}
.post:hover {
background: #222;
background: var(--foreground);
}
.post:hover > .post_left {
background: #333;
background: var(--highlighted);
}
.post_left, .post_right {
display: flex;
flex-direction: column;
overflow-wrap: anywhere;
overflow-wrap: break-word;
}
.post_left {
text-align: center;
background: #222;
border-radius: 5px 0px 0px 5px;
background: var(--foreground);
border-radius: 5px 0 0 5px;
flex-direction: column;
min-width: 50px;
padding: 5px;
transition: 0.2s all;
}
.post_right > p > span, .comment_right > p > span {
float: right;
.post_score {
margin-top: 20px;
color: var(--accent);
}
.post_title {
font-size: 20px;
#post_footer {
display: flex;
justify-content: space-between;
opacity: 0.5;
font-size: 14px;
}
#post_links {
display: flex;
list-style: none;
padding: 0;
font-weight: bold;
}
#post_links > li {
margin-right: 15px;
}
.post_subreddit {
font-weight: bold;
}
.post_score {
margin-top: 1em;
color: aqua;
.post_title {
font-size: 16px;
line-height: 1.5;
margin-top: 10px;
}
.post_text {
padding: 15px;
display: flex;
flex-direction: column;
}
.post_right {
padding: 20px 25px;
flex-grow: 1;
flex-shrink: 1;
justify-content: space-between;
}
.post_right > * {
margin: 5px;
}
.post_right > p {
opacity: 0.75;
}
.post_image {
max-width: 500px;
.post_media {
max-width: 90%;
align-self: center;
}
.post_image[src=""] {
display: none;
margin-top: 15px;
}
.post_body {
@@ -226,46 +400,50 @@ span {
margin: 10px 5px;
}
.post_body > p:not(:first-child) {
margin-top: 1.5em;
}
.post_body a {
text-decoration: underline;
color: aqua;
#post_url {
color: var(--accent);
margin-top: 10px;
}
.post_thumbnail {
object-fit: cover;
width: auto;
flex-shrink: 0;
padding: 10px;
border-radius: 15px;
border-radius: 5px;
border: 1px solid var(--foreground);
max-width: 20%;
}
.post_thumbnail[src=""] {
border: none;
.post_flair {
background: var(--accent);
color: black;
padding: 5px;
border-radius: 5px;
font-size: 12px;
font-weight: bold;
}
small {
background: aqua;
color: black;
padding: 5px;
border-radius: 5px;
font-size: 12px;
font-weight: bold;
.nsfw {
color: #FF5C5D;
margin-top: 20px;
border: 1px solid #FF5C5D;
padding: 5px;
font-size: 12px;
border-radius: 5px;
font-weight: bold;
}
.stickied {
--accent: #5cff85;
border: 1px solid #5cff85;
}
/* Comment */
.comment {
margin: 10px 0;
border-radius: 5px;
display: flex;
border: 2px solid #222;
}
.comment:hover {
background: #111;
}
.comment_left, .comment_right {
@@ -276,20 +454,27 @@ small {
.comment_left {
text-align: center;
min-width: 50px;
padding: 5px 0;
align-items: center;
}
.comment_title { font-size: 20px; }
.comment_link { text-decoration: underline; }
.comment_author { opacity: 0.9; }
.comment_author.op {
color: var(--accent);
font-weight: bold;
}
.author_flair {
background: var(--highlighted);
color: white;
padding: 5px;
align-items: flex-end;
}
.comment_title {
font-size: 20px;
}
.comment_upvote {
margin-top: 0.5em;
border-radius: 5px 5px 0px 0px;
background: #222;
width: 40px;
padding: 10px 0px 0px 0px;
margin-right: 5px;
border-radius: 5px;
font-size: 12px;
font-weight: bold;
}
.comment_subreddit {
@@ -297,26 +482,23 @@ small {
}
.comment_score {
color: aqua;
background: #222;
width: 40px;
padding: 5px 0px 10px 0px;
border-radius: 0px 0px 5px 5px;
color: var(--accent);
background: var(--foreground);
min-width: 40px;
border-radius: 5px;
padding: 10px 0;
font-size: 16px;
}
.comment_right {
word-wrap: anywhere;
padding: 10px 25px 10px 10px;
padding: 10px 25px 10px 5px;
flex-grow: 1;
flex-shrink: 1;
}
.comment_right > * {
margin: 5px;
}
.comment_right > p {
opacity: 0.75;
.comment_data > * {
margin-right: 5px;
}
.comment_image {
@@ -324,10 +506,6 @@ small {
align-self: center;
}
.comment_image[src=""] {
display: none;
}
.comment_body {
opacity: 0.9;
font-weight: normal;
@@ -335,31 +513,159 @@ small {
}
.comment_body > p:not(:first-child) {
margin-top: 1.5em;
margin-top: 20px;
}
.comment_body a {
text-decoration: underline;
color: aqua;
color: var(--accent);
}
.deeper_replies {
color: var(--accent);
margin-left: 15px;
}
::marker {
color: var(--accent);
}
.replies > .comment {
margin-left: -20px;
padding: 5px;
}
.datetime {
opacity: 0.5;
}
.line {
width: 2px;
height: 100%;
background: var(--foreground);
}
.post.comment {
background: #000;
border: 2px solid #222;
border: 2px solid var(--foreground);
}
.post.comment > .post_left {
background: black;
}
/* Markdown */
.md > *:not(:first-child) {
margin-top: 20px;
}
.md h1 { font-size: 22px; }
.md h2 { font-size: 20px; }
.md h3 { font-size: 18px; }
.md h4 { font-size: 16px; }
.md h5 { font-size: 14px; }
.md h6 { font-size: 12px; }
.md blockquote {
padding-left: 6px;
margin: 4px 0 4px 5px;
border-left: 4px solid var(--highlighted);
}
.md a {
color: var(--accent);
}
.md li { margin: 10px 0; }
.toc_child { list-style: none; }
.md pre {
background: var(--outside);
padding: 20px;
margin-top: 10px;
border-radius: 5px;
box-shadow: var(--black-contrast);
}
.md table {
margin: 5px;
}
.md code {
font-family: monospace;
font-size: 14px;
}
.md code:not(.md pre > code) { background: var(--highlighted); }
/* Tables */
table {
border: 3px #333 solid;
border-spacing: 0rem;
border: 3px var(--highlighted) solid;
border-spacing: 0;
}
td, th {
border: 1px #333 solid;
padding: 0.5em;
}
border: 1px var(--highlighted) solid;
padding: 10px;
}
/* Mobile */
@media screen and (max-width: 480px) {
.post {
flex-direction: column-reverse;
}
.post_header {
font-size: 14px;
}
.post_left {
border-radius: 0 0 5px 5px;
flex-direction: row;
justify-content: center;
align-items: center;
}
.nsfw {
margin: 5px 0px 5px 10px;
}
.post_score {
margin: 5px 0;
}
.replies > .comment {
margin-left: -25px;
padding: 5px 0;
}
.datetime {
width: 100%;
}
}
@media screen and (max-width: 800px) {
main {
flex-direction: column-reverse;
padding: 10px;
margin: 10px 0;
}
nav {
flex-direction: column;
padding: 10px;
}
aside, #subreddit, #user {
margin: 0;
max-width: 100%;
}
#user, #sidebar { margin: 20px 0; }
#logo { margin: 5px auto; }
#searchbox { width: 100%; }
#github { display: none; }
}

View File

@@ -2,29 +2,25 @@
<html lang="en">
<head>
{% block head %}
<title>{% block title %}{% endblock %}</title>
<meta name="description" content="View on Libreddit, an alternative private front-end to Reddit.">
<title>{% block title %}Libreddit{% endblock %}</title>
<meta http-equiv="Referrer-Policy" content="no-referrer">
<meta http-equiv="Content-Security-Policy" content="default-src 'self'; style-src 'self' 'unsafe-inline'; base-uri 'none'; form-action 'self';">
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<meta name="description" content="View on Libreddit, an alternative private front-end to Reddit.">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<link rel="stylesheet" href="/style.css">
{% block sortstyle %}
<style>
#sort > #sort_{{ sort }} {
background: aqua;
color: black;
}
</style>
{% endblock %}
{% endblock %}
</head>
</head>
<body>
{% block body %}
{% block header %}
<header>
<a href="/"><span id="lib">lib</span>reddit.</a>
<!-- NAVIGATION BAR -->
<nav>
<a id="logo" href="/"><span id="lib">lib</span>reddit. <span id="version">v{{ env!("CARGO_PKG_VERSION") }}</span></a>
{% block search %}{% endblock %}
<a id="github" href="https://github.com/spikecodes/libreddit">GITHUB</a>
</header>
{% endblock %}
</nav>
<!-- MAIN CONTENT -->
{% block body %}
<main>
{% block content %}
{% endblock %}

View File

@@ -1,43 +0,0 @@
{% extends "base.html" %}
{% block title %}Libreddit{% endblock %}
{% block content %}
<div id="sort">
<div id="sort_hot"><a href="?sort=hot">Hot</a></div>
<div id="sort_top"><a href="?sort=top">Top</a></div>
<div id="sort_new"><a href="?sort=new">New</a></div>
<div id="sort_rising"><a href="?sort=rising">Rising</a></div>
</div>
{% for post in posts %}
<div class="post">
<div class="post_left">
<h3 class="post_score">{{ post.score }}</h3>
</div>
<div class="post_right">
<p>
<b><a class="post_subreddit" href="/r/{{ post.community }}">r/{{ post.community }}</a></b>
&bull;
Posted by
<a class="post_author" href="/u/{{ post.author }}">u/{{ post.author }}</a>
<span style="float: right;">{{ post.time }}</span>
</p>
<h3 class="post_title">
{% if post.flair.0 != "" %}
<small style="color:{{ post.flair.2 }}; background:{{ post.flair.1 }}">{{ post.flair.0 }}</small>
{% endif %}
<a href="{{ post.url }}">{{ post.title }}</a>
</h3>
</div>
<img class="post_thumbnail" src="{{ post.media }}">
</div><br>
{% endfor %}
<footer>
{% if ends.0 != "" %}
<a href="?before={{ ends.0 }}">PREV</a>
{% endif %}
{% if ends.1 != "" %}
<a href="?after={{ ends.1 }}">NEXT</a>
{% endif %}
</footer>
{% endblock %}

View File

@@ -1,52 +1,121 @@
{% extends "base.html" %}
{% import "utils.html" as utils %}
{% block title %}{{ post.title }} - r/{{ post.community }}{% endblock %}
{% block search %}
{% call utils::search(["/r/", post.community.as_str()].concat(), "") %}
{% endblock %}
{% block root %}/r/{{ post.community }}{% endblock %}{% block location %}r/{{ post.community }}{% endblock %}
{% block head %}
{% call super() %}
<meta name="author" content="u/{{ post.author }}">
{% endblock %}
<!-- OPEN COMMENT MACRO -->
{% macro comment(item) -%}
<div id="{{ item.id }}" class="comment">
<div class="comment_left">
<p class="comment_score">{{ item.score }}</p>
<div class="line"></div>
</div>
<details class="comment_right" open>
<summary class="comment_data"><a class="comment_author {% if item.author == post.author %}op{% endif %}" href="/u/{{ item.author }}">u/{{ item.author }}</a>
{% if item.flair.0 != "" %}
<small class="author_flair">{{ item.flair.0 }}</small>
{% endif %}
<span class="datetime">{{ item.time }}</span>
</summary>
<p class="comment_body">{{ item.body }}</p>
{%- endmacro %}
<!-- CLOSE COMMENT MACRO -->
{% macro close() %}
</details></div>
{% endmacro %}
{% block content %}
<div class="post highlighted">
<div class="post_left">
<h3 class="post_score">{{ post.score }}</h3>
<div id="column_one">
<!-- POST CONTENT -->
<div class="post highlighted panel">
<div class="post_left">
<p class="post_score">{{ post.score }}</p>
{% if post.flags.nsfw %}<div class="nsfw">NSFW</div>{% endif %}
</div>
<div class="post_right">
<div class="post_text">
<p class="post_header">
<a class="post_subreddit" href="/r/{{ post.community }}">r/{{ post.community }}</a>
<span class="dot">&bull;</span>
<a class="post_author" href="/u/{{ post.author }}">u/{{ post.author }}</a>
{% if post.author_flair.0 != "" %}
<small class="author_flair">{{ post.author_flair.0 }}</small>
{% endif %}
<span class="dot">&bull;</span>
<span class="datetime">{{ post.time }}</span>
</p>
<a href="{{ post.permalink }}" class="post_title">
{{ post.title }}
{% if post.flair.0 != "" %}
<small class="post_flair" style="color:{{ post.flair.2 }}; background:{{ post.flair.1 }}">{{ post.flair.0 }}</small>
{% endif %}
</a>
<!-- POST MEDIA -->
{% if post.post_type == "image" %}
<img class="post_media" src="{{ post.media }}"/>
{% else if post.post_type == "video" %}
<video class="post_media" src="{{ post.media }}" controls autoplay loop></video>
{% else if post.post_type == "link" %}
<a id="post_url" href="{{ post.media }}">{{ post.media }}</a>
{% endif %}
<!-- POST BODY -->
<div class="post_body">{{ post.body }}</div>
<div id="post_footer">
<ul id="post_links">
<li><a href="/{{ post.id }}">permalink</a></li>
<li><a href="https://reddit.com/{{ post.id }}">reddit</a></li>
</ul>
<p>{{ post.upvote_ratio }}% Upvoted</p>
</div>
</div>
</div>
</div>
<div class="post_right">
<p>
<b><a class="post_subreddit" href="/r/{{ post.community }}">r/{{ post.community }}</a></b>
&bull;
Posted by
<a class="post_author" href="/u/{{ post.author }}">u/{{ post.author }}</a>
<span>{{ post.time }}</span>
</p>
<h3 class="post_title">
{{ post.title }}
{% if post.flair.0 != "" %}
<small style="color:{{ post.flair.2 }}; background:{{ post.flair.1 }}">{{ post.flair.0 }}</small>
{% endif %}
</h3>
{{ post.media }}
<h4 class="post_body">{{ post.body }}</h4>
<!-- SORT FORM -->
<form id="sort">
<select name="sort">
{% call utils::options(sort, ["confidence", "top", "new", "controversial", "old"], "") %}
</select><input id="sort_submit" type="submit" value="&rarr;">
</form>
<!-- COMMENTS -->
{% for c in comments -%}
<div class="thread">
<!-- EACH COMMENT -->
{% call comment(c) %}
<div class="replies">{% for reply1 in c.replies %}{% call comment(reply1) %}
<!-- FIRST-LEVEL REPLIES -->
<div class="replies">{% for reply2 in reply1.replies %}{% call comment(reply2) %}
<!-- SECOND-LEVEL REPLIES -->
<div class="replies">{% for reply3 in reply2.replies %}{% call comment(reply3) %}
<!-- THIRD-LEVEL REPLIES -->
{% if reply3.replies.len() > 0 %}
<!-- LINK TO CONTINUE REPLIES -->
<a class="deeper_replies" href="{{ post.permalink }}{{ reply3.id }}">&rarr; More replies</a>
{% endif %}
{% call close() %}
{% endfor %}
</div>{% call close() %}
{% endfor %}
</div>{% call close() %}
{% endfor %}
</div>{% call close() %}
</div>
{%- endfor %}
</div>
<div id="sort">
<div id="sort_confidence"><a href="?sort=confidence">Best</a></div>
<div id="sort_top"><a href="?sort=top">Top</a></div>
<div id="sort_new"><a href="?sort=new">New</a></div>
<div id="sort_controversial"><a href="?sort=controversial">Controversial</a></div>
<div id="sort_old"><a href="?sort=old">Old</a></div>
</div>
{% for comment in comments %}
<div class="comment">
<div class="comment_left">
<div class="comment_upvote"></div>
<h3 class="comment_score">{{ comment.score }}</h3>
</div>
<div class="comment_right">
<p>
Posted by <a class="comment_author" href="/u/{{ comment.author }}">u/{{ comment.author }}</a>
<span>{{ comment.time }}</span>
</p>
<h4 class="comment_body">{{ comment.body }}</h4>
</div>
</div><br>
{% endfor %}
{% endblock %}

templates/search.html (new file, 82 lines)
View File

@@ -0,0 +1,82 @@
{% extends "base.html" %}
{% import "utils.html" as utils %}
{% block title %}Libreddit: search results - {{ params.q }}{% endblock %}
{% block content %}
<div id="column_one">
<form id="search_sort">
<input id="search" type="text" name="q" placeholder="Search" value="{{ params.q }}">
{% if sub != "" %}
<div id="inside">
<input type="checkbox" name="restrict_sr" id="restrict_sr" {% if params.restrict_sr != "" %}checked{% endif %}>
<label for="restrict_sr">in r/{{ sub }}</label>
</div>
{% endif %}
<select id="sort_options" name="sort">
{% call utils::options(params.sort, ["relevance", "hot", "top", "new", "comments"], "") %}
</select>{% if params.sort != "new" %}<select id="timeframe" name="t">
{% call utils::options(params.t, ["hour", "day", "week", "month", "year", "all"], "all") %}
</select>{% endif %}<input id="sort_submit" type="submit" value="&rarr;">
</form>
{% for post in posts %}
{% if post.title != "Comment" %}
<div class="post panel">
<div class="post_left">
<p class="post_score">{{ post.score }}</p>
{% if post.flags.nsfw %}<div class="nsfw">NSFW</div>{% endif %}
</div>
<div class="post_right">
<div class="post_text">
<p class="post_header">
<a class="post_subreddit" href="/r/{{ post.community }}">r/{{ post.community }}</a>
<span class="dot">&bull;</span>
<a class="post_author" href="/u/{{ post.author }}">u/{{ post.author }}</a>
{% if post.author_flair.0 != "" %}
<small class="author_flair">{{ post.author_flair.0 }}</small>
{% endif %}
<span class="dot">&bull;</span>
<span class="datetime">{{ post.time }}</span>
</p>
<p class="post_title">
{% if post.flair.0 != "" %}
<small class="post_flair" style="color:{{ post.flair.2 }}; background:{{ post.flair.1 }}">{{ post.flair.0 }}</small>
{% endif %}
<a href="{{ post.permalink }}">{{ post.title }}</a>
</p>
</div>
<img class="post_thumbnail" src="{{ post.media }}">
</div>
</div>
{% else %}
<div class="comment">
<div class="comment_left">
<p class="comment_score">{{ post.score }}</p>
<div class="line"></div>
</div>
<details class="comment_right" open>
<summary class="comment_data">
<a class="comment_link" href="{{ post.permalink }}">COMMENT</a>
<span class="datetime">{{ post.time }}</span>
</summary>
<p class="comment_body">{{ post.body }}</p>
</details>
</div>
{% endif %}
{% endfor %}
<footer>
{% if params.before != "" %}
<a href="?q={{ params.q }}&restrict_sr={{ params.restrict_sr }}
&sort={{ params.sort }}&t={{ params.t }}
&before={{ params.before }}">PREV</a>
{% endif %}
{% if params.after != "" %}
<a href="?q={{ params.q }}&restrict_sr={{ params.restrict_sr }}
&sort={{ params.sort }}&t={{ params.t }}
&after={{ params.after }}">NEXT</a>
{% endif %}
</footer>
</div>
{% endblock %}

templates/settings.html (new file, 18 lines)
View File

@@ -0,0 +1,18 @@
{% extends "base.html" %}
{% import "utils.html" as utils %}
{% block title %}Libreddit Settings{% endblock %}
{% block search %}
{% call utils::search("".to_owned(), "", "") %}
{% endblock %}
{% block body %}
<main>
<form action="/settings/save" method="POST">
<label for="pref_nsfw">NSFW</label>
<input type="checkbox" name="pref_nsfw" id="pref_nsfw" {% if pref_nsfw == "on" %}checked{% endif %}>
<input id="sort_submit" type="submit" value="&rarr;">
</form>
</main>
{% endblock %}

View File

@@ -1,62 +1,96 @@
{% extends "base.html" %}
{% block title %}r/{{ sub.name }}: {{ sub.description }}{% endblock %}
{% import "utils.html" as utils %}
{% block title %}
{% if sub.title != "" %}{{ sub.title }}
{% else if sub.name != "" %}{{ sub.name }}
{% else %}Libreddit{% endif %}
{% endblock %}
{% block search %}
{% call utils::search(["/r/", sub.name.as_str()].concat(), "") %}
{% endblock %}
{% block body %}
{% block header %}
<header>
<a href="/"><span id="lib">lib</span>reddit.</a>
<a id="github" href="https://github.com/spikecodes/libreddit">GITHUB</a>
</header>
{% endblock %}
<div id="about">
<div class="subreddit">
<div class="subreddit_left">
{{ sub.icon }}
</div>
<div class="subreddit_right">
<h2 class="subreddit_name">r/{{ sub.name }}</h2>
<p class="subreddit_description">{{ sub.description }}</p>
<div id="stats">👤 {{ sub.members }} 🟢 {{ sub.active }}</div>
</div>
</div>
</div>
<main>
<div id="sort">
<div id="sort_hot"><a href="?sort=hot">Hot</a></div>
<div id="sort_top"><a href="?sort=top">Top</a></div>
<div id="sort_new"><a href="?sort=new">New</a></div>
</div>
{% for post in posts %}
<div class="post">
<div class="post_left">
<h3 class="post_score">{{ post.score }}</h3>
</div>
<div class="post_right">
<p>
<b><a class="post_subreddit" href="/r/{{ post.community }}">r/{{ sub.name }}</a></b>
&bull;
Posted by
<a class="post_author" href="/u/{{ post.author }}">u/{{ post.author }}</a>
<span>{{ post.time }}</span>
</p>
<h3 class="post_title">
{% if post.flair.0 != "" %}
<small style="color:{{ post.flair.2 }}; background:{{ post.flair.1 }}">{{ post.flair.0 }}</small>
<div id="column_one">
<form id="sort">
<div id="sort_options">
{% if sub.name.is_empty() %}
{% call utils::sort("", ["hot", "new", "top", "rising", "controversial"], sort.0) %}
{% else %}
{% call utils::sort(["/r/", sub.name.as_str()].concat(), ["hot", "new", "top", "rising", "controversial"], sort.0) %}
{% endif %}
<a href="{{ post.url }}">{{ post.title }}</a>
</h3>
</div>
{% if sort.0 == "top" || sort.0 == "controversial" %}<select id="timeframe" name="t">
{% call utils::options(sort.1, ["hour", "day", "week", "month", "year", "all"], "day") %}
<input id="sort_submit" type="submit" value="&rarr;">
</select>{% endif %}
</form>
{% for post in posts %}
<div class="post {% if post.flags.stickied %}stickied{% endif %} panel">
<div class="post_left">
<p class="post_score">{{ post.score }}</p>
{% if post.flags.nsfw %}<div class="nsfw">NSFW</div>{% endif %}
</div>
<div class="post_right">
<div class="post_text">
<p class="post_header">
<a class="post_subreddit" href="/r/{{ post.community }}">r/{{ post.community }}</a>
<span class="dot">&bull;</span>
<a class="post_author" href="/u/{{ post.author }}">u/{{ post.author }}</a>
<span class="dot">&bull;</span>
<span class="datetime">{{ post.time }}</span>
</p>
<p class="post_title">
{% if post.flair.0 != "" %}
<small class="post_flair" style="color:{{ post.flair.2 }}; background:{{ post.flair.1 }}">{{ post.flair.0 }}</small>
{% endif %}
<a href="{{ post.permalink }}">{{ post.title }}</a>
</p>
</div>
<img class="post_thumbnail" src="{{ post.media }}">
</div>
</div>
<img class="post_thumbnail" src="{{ post.media }}">
</div><br>
{% endfor %}
{% endfor %}
<footer>
{% if ends.0 != "" %}
<a href="?before={{ ends.0 }}">PREV</a>
{% endif %}
<footer>
{% if ends.0 != "" %}
<a href="?sort={{ sort.0 }}&before={{ ends.0 }}">PREV</a>
{% endif %}
{% if ends.1 != "" %}
<a href="?after={{ ends.1 }}">NEXT</a>
{% endif %}
</footer>
{% if ends.1 != "" %}
<a href="?sort={{ sort.0 }}&after={{ ends.1 }}">NEXT</a>
{% endif %}
</footer>
</div>
{% if sub.name != "" %}
<aside>
<div class="panel" id="subreddit">
{% if sub.wiki %}
<div id="top">
<div>Posts</div>
<a href="/r/{{ sub.name }}/wiki/index">Wiki</a>
</div>
{% endif %}
<div id="sub_meta">
<img id="sub_icon" src="{{ sub.icon }}">
<p id="sub_title">{{ sub.title }}</p>
<p id="sub_name">r/{{ sub.name }}</p>
<p id="sub_description">{{ sub.description }}</p>
<div id="sub_details">
<label>Members</label>
<label>Active</label>
<div>{{ sub.members }}</div>
<div>{{ sub.active }}</div>
</div>
</div>
</div>
<details class="panel" id="sidebar">
<summary id="sidebar_label">Sidebar</summary>
<div id="sidebar_contents">{{ sub.info }}</div>
</details>
</aside>
{% endif %}
</main>
{% endblock %}

View File

@@ -1,69 +1,89 @@
{% extends "base.html" %}
{% block title %}Libreddit: u/{{ user.name }}{% endblock %}
{% import "utils.html" as utils %}
{% block search %}
{% call utils::search("".to_owned(), "") %}
{% endblock %}
{% block title %}{{ user.name.replace("u/", "") }} (u/{{ user.name }}) - Libreddit{% endblock %}
{% block body %}
{% block header %}
<header>
<a href="/"><span id="lib">lib</span>reddit.</a>
<a id="github" href="https://github.com/spikecodes/libreddit">GITHUB</a>
</header>
{% endblock %}
<div id="about">
<div class="user">
<div class="user_left">
<img class="user_icon" src="{{ user.icon }}">
<main style="max-width: 1000px;">
<div id="column_one">
<form id="sort">
<select name="sort">
{% call utils::options(sort.0, ["hot", "new", "top"], "") %}
</select>{% if sort.0 == "top" %}<select id="timeframe" name="t">
{% call utils::options(sort.1, ["hour", "day", "week", "month", "year", "all"], "all") %}
</select>{% endif %}<input id="sort_submit" type="submit" value="&rarr;">
</form>
{% for post in posts %}
{% if post.title != "Comment" %}
<div class="post panel">
<div class="post_left">
<p class="post_score">{{ post.score }}</p>
{% if post.flags.nsfw %}<div class="nsfw">NSFW</div>{% endif %}
</div>
<div class="post_right">
<div class="post_text">
<p class="post_header">
<a class="post_subreddit" href="/r/{{ post.community }}">r/{{ post.community }}</a>
{% if post.author_flair.0 != "" %}
<small class="author_flair">{{ post.author_flair.0 }}</small>
{% endif %}
<span class="dot">&bull;</span>
<span class="datetime" style="float: right;">{{ post.time }}</span>
</p>
<p class="post_title">
{% if post.flair.0 == "Comment" %}
{% else if post.flair.0 == "" %}
{% else %}
<small class="post_flair" style="color:{{ post.flair.2 }}; background:{{ post.flair.1 }}">{{ post.flair.0 }}</small>
{% endif %}
<a href="{{ post.permalink }}">{{ post.title }}</a>
</p>
</div>
<img class="post_thumbnail" src="{{ post.media }}">
</div>
</div>
<div class="user_right">
<h2 class="user_name">u/{{ user.name }}</h2>
<p class="user_description"><span>Karma:</span> {{ user.karma }} | <span>Description:</span> "{{ user.description }}"</p>
{% else %}
<div class="comment">
<div class="comment_left">
<p class="comment_score">{{ post.score }}</p>
<div class="line"></div>
</div>
<details class="comment_right" open>
<summary class="comment_data">
<a class="comment_link" href="{{ post.permalink }}">COMMENT</a>
<span class="datetime">{{ post.time }}</span>
</summary>
<p class="comment_body">{{ post.body }}</p>
</details>
</div>
{% endif %}
{% endfor %}
<footer>
{% if ends.0 != "" %}
<a href="?sort={{ sort.0 }}&before={{ ends.0 }}">PREV</a>
{% endif %}
{% if ends.1 != "" %}
<a href="?sort={{ sort.0 }}&after={{ ends.1 }}">NEXT</a>
{% endif %}
</footer>
</div>
</div>
<main>
<div id="sort">
<div id="sort_hot"><a href="?sort=hot">Hot</a></div>
<div id="sort_top"><a href="?sort=top">Top</a></div>
<div id="sort_new"><a href="?sort=new">New</a></div>
</div>
{% for post in posts %}
{% if post.title != "Comment" %}
<div class='post'>
<div class="post_left">
<h3 class="post_score">{{ post.score }}</h3>
<aside>
<div class="panel" id="user">
<img id="user_icon" src="{{ user.icon }}">
<p id="user_title">{{ user.title }}</p>
<p id="user_name">u/{{ user.name }}</p>
<div id="user_description">{{ user.description }}</div>
<div id="user_details">
<label>Karma</label>
<label>Created</label>
<div>{{ user.karma }}</div>
<div>{{ user.created }}</div>
</div>
</div>
<div class="post_right">
<p>
<b><a class="post_subreddit" href="/r/{{ post.community }}">r/{{ post.community }}</a></b>
&bull;
Posted by
<a class="post_author" href="/u/{{ post.author }}">u/{{ post.author }}</a>
<span style="float: right;">{{ post.time }}</span>
</p>
<h3 class="post_title">
{% if post.flair.0 == "Comment" %}
{% else if post.flair.0 == "" %}
{% else %}
<small style="color:{{ post.flair.2 }}; background:{{ post.flair.1 }}">{{ post.flair.0 }}</small>
{% endif %}
<a href="{{ post.url }}">{{ post.title }}</a>
</h3>
</div>
<img class="post_thumbnail" src="{{ post.media }}">
</div><br>
{% else %}
<div class="comment">
<div class="comment_left">
<div class="comment_upvote"></div>
<h3 class="comment_score">{{ post.score }}</h3>
</div>
<div class="comment_right">
<p>
COMMENT
<span>{{ post.time }}</span>
</p>
<h4 class="comment_body">{{ post.body }}</h4>
</div>
</div><br>
{% endif %}
{% endfor %}
</aside>
</main>
{% endblock %}

templates/utils.html (new file, 28 lines)
View File

@@ -0,0 +1,28 @@
{% macro options(current, values, default) -%}
{% for value in values %}
<option value="{{ value }}" {% if current == value || (current == "" && value == default) %}selected{% endif %}>
{{ format!("{}{}", value.get(0..1).unwrap().to_uppercase(), value.get(1..).unwrap()) }}
</option>
{% endfor %}
{%- endmacro %}
{% macro sort(root, methods, selected) -%}
{% for method in methods %}
<a {% if method == selected %}class="selected"{% endif %} href="{{ root }}/{{ method }}">
{{ format!("{}{}", method.get(0..1).unwrap().to_uppercase(), method.get(1..).unwrap()) }}
</a>
{% endfor %}
{%- endmacro %}
{% macro search(root, search) -%}
<form action="{% if root != "/r/" && !root.is_empty() %}{{ root }}{% endif %}/search/" id="searchbox">
<input id="search" type="text" name="q" placeholder="Search" value="{{ search }}">
{% if root != "/r/" && !root.is_empty() %}
<div id="inside">
<input type="checkbox" name="restrict_sr" id="restrict_sr">
<label for="restrict_sr">in {{ root }}</label>
</div>
{% endif %}
<input type="submit" value="&rarr;">
</form>
{%- endmacro %}
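The first-letter capitalization inside the `options` and `sort` macros above is ordinary Rust evaluated by Askama; pulled out on its own it is roughly the following (using `unwrap_or("")` in place of the macros' `unwrap()`):

    // Standalone version of the capitalization used in options()/sort(),
    // e.g. "hot" -> "Hot", "controversial" -> "Controversial".
    fn capitalize(value: &str) -> String {
        format!("{}{}", value.get(0..1).unwrap_or("").to_uppercase(), value.get(1..).unwrap_or(""))
    }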

templates/wiki.html (new file, 25 lines)
View File

@@ -0,0 +1,25 @@
{% extends "base.html" %}
{% import "utils.html" as utils %}
{% block title %}
{% if sub != "" %}{{ page }} - {{ sub }}
{% else %}Libreddit{% endif %}
{% endblock %}
{% block search %}
{% call utils::search(["/r/", sub.as_str()].concat(), "") %}
{% endblock %}
{% block body %}
<main>
<div class="panel" id="column_one">
<div id="top">
<a href="/r/{{ sub }}">Posts</a>
<div>Wiki</div>
</div>
<div id="wiki">
{{ wiki }}
</div>
</div>
</main>
{% endblock %}