Compare commits

...

26 Commits

Author SHA1 Message Date
ngn
356d516e28 [skip ci] fix the docker badge URL
Signed-off-by: ngn <ngn@ngn.tf>
2025-05-23 01:14:33 +03:00
ngn
bdfa3a3ba2 fix the workdir for the docker build image
All checks were successful
docker / nitter (push) Successful in 5m3s
docker / session (push) Successful in 11s
ups / ups (push) Successful in 39s
Signed-off-by: ngn <ngn@ngn.tf>
2025-05-23 01:08:04 +03:00
ngn
c5ffb16340 remove reference to instancesurl
Some checks failed
docker / nitter (push) Failing after 4m59s
docker / session (push) Successful in 11s
Signed-off-by: ngn <ngn@ngn.tf>
2025-05-23 00:59:42 +03:00
ngn
1d1c354331 remove old markdown depend
Some checks failed
docker / session (push) Has been cancelled
docker / nitter (push) Has been cancelled
Signed-off-by: ngn <ngn@ngn.tf>
2025-05-23 00:57:43 +03:00
ngn
062f923b5d [skip ci] add the renovate config
Signed-off-by: ngn <ngn@ngn.tf>
2025-05-23 00:56:13 +03:00
ngn
d221df59df apply patches from the old upstream and cleanup
Some checks failed
docker / nitter (push) Failing after 2m29s
docker / session (push) Successful in 1m20s
Signed-off-by: ngn <ngn@ngn.tf>
2025-05-23 00:53:45 +03:00
ngn
9808c6a543 add the ups workflow
Signed-off-by: ngn <ngn@ngn.tf>
2025-05-13 00:11:10 +03:00
ngn
2002de7851 [skip ci] update the image name in example compose
Signed-off-by: ngn <ngn@ngn.tf>
2025-03-24 23:44:26 +03:00
ngn
6703064cbd remove unused test scripts
All checks were successful
Build the docker image for the get_account.py script / build (push) Successful in 1m13s
Build the docker image for the web server / build (push) Successful in 8m22s
Signed-off-by: ngn <ngn@ngn.tf>
2025-03-24 23:41:46 +03:00
ngn
ad931427d0 use the correct Dockerfile
Signed-off-by: ngn <ngn@ngn.tf>
2025-03-24 23:40:39 +03:00
ngn
de8f69b182 fix the image names for the docker workflows
Some checks failed
Build the docker image for the web server / build (push) Waiting to run
Build the docker image for the get_account.py script / build (push) Has been cancelled
Signed-off-by: ngn <ngn@ngn.tf>
2025-03-24 23:36:28 +03:00
ngn
f1078aa647 separate the workflows for the docker images
Some checks failed
Build the docker image for the get_account.py script / build (push) Has been cancelled
Build the docker image for the web server / build (push) Has been cancelled
Signed-off-by: ngn <ngn@ngn.tf>
2025-03-24 23:34:12 +03:00
ngn
1ed15ef433 add the get_account.py script from upstream
Some checks failed
Build and publish the docker images / build (push) Has been cancelled
Signed-off-by: ngn <ngn@ngn.tf>
2025-03-24 23:26:17 +03:00
ngn
398ba2a9a5 [skip ci] add renovate config
Signed-off-by: ngn <ngn@ngn.tf>
2025-01-21 10:23:10 +03:00
ngn
54763be57a fix default pref config
All checks were successful
Build and publish the docker image / build (push) Successful in 9m7s
Signed-off-by: ngn <ngn@ngn.tf>
2025-01-21 07:52:56 +03:00
ngn
fec37e8b76 switch to alpine for the runner bc of lib issues
All checks were successful
Build and publish the docker image / build (push) Successful in 8m57s
Signed-off-by: ngn <ngn@ngn.tf>
2025-01-21 07:07:53 +03:00
ngn
b6753bf862 ill kms
All checks were successful
Build and publish the docker image / build (push) Successful in 9m5s
Signed-off-by: ngn <ngn@ngn.tf>
2025-01-21 06:53:53 +03:00
ngn
3d16d4c361 im stupid
All checks were successful
Build and publish the docker image / build (push) Successful in 8m53s
Signed-off-by: ngn <ngn@ngn.tf>
2025-01-21 06:43:52 +03:00
ngn
e797c0d218 fix permissions fr
All checks were successful
Build and publish the docker image / build (push) Successful in 9m2s
Signed-off-by: ngn <ngn@ngn.tf>
2025-01-21 06:32:52 +03:00
ngn
f14163ba8a fix permissioning in the dockerfile
All checks were successful
Build and publish the docker image / build (push) Successful in 11m44s
Signed-off-by: ngn <ngn@ngn.tf>
2025-01-21 06:15:25 +03:00
ngn
2d61656ddb actual shit syntax
All checks were successful
Build and publish the docker image / build (push) Successful in 9m24s
Signed-off-by: ngn <ngn@ngn.tf>
2025-01-21 05:48:52 +03:00
ngn
410349f615 what the fuck is this shit syntax
Some checks failed
Build and publish the docker image / build (push) Failing after 16s
Signed-off-by: ngn <ngn@ngn.tf>
2025-01-21 05:38:43 +03:00
ngn
f3c2ac1417 remove missing depend install from Dockerfile
Some checks failed
Build and publish the docker image / build (push) Failing after 1m30s
Signed-off-by: ngn <ngn@ngn.tf>
2025-01-21 05:33:50 +03:00
ngn
cd580c3e22 use the correct package maanger in Dockerfile
Some checks failed
Build and publish the docker image / build (push) Failing after 15s
Signed-off-by: ngn <ngn@ngn.tf>
2025-01-21 05:31:38 +03:00
ngn
48b1d9e565 fix video playback
Some checks failed
Build and publish the docker image / build (push) Failing after 14s
Signed-off-by: ngn <ngn@ngn.tf>
2025-01-21 05:29:57 +03:00
ngn
96dad1e3a1 remove about page
Signed-off-by: ngn <ngn@ngn.tf>
2025-01-21 05:26:35 +03:00
50 changed files with 671 additions and 944 deletions


@ -1,28 +0,0 @@
name: Build and publish the docker image
on:
push:
branches: ["custom"]
env:
REGISTRY: git.ngn.tf
IMAGE: ${{gitea.repository}}
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: "https://github.com/actions/checkout@v4"
- name: Login to container repo
uses: "https://github.com/docker/login-action@v1"
with:
registry: ${{env.REGISTRY}}
username: ${{gitea.actor}}
password: ${{secrets.PACKAGES_TOKEN}}
- name: Build image
run: |
docker build . --tag ${{env.REGISTRY}}/${{env.IMAGE}}:latest
docker push ${{env.REGISTRY}}/${{env.IMAGE}}:latest


@ -0,0 +1,48 @@
name: docker
on:
push:
branches: ["main"]
env:
REGISTRY: git.ngn.tf
IMAGE: ${{gitea.repository}}
jobs:
nitter:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Login to container repo
uses: docker/login-action@v3
with:
registry: ${{env.REGISTRY}}
username: ${{gitea.actor}}
password: ${{secrets.PACKAGES_TOKEN}}
- name: Build and push the image
run: |
docker build . -f docker/nitter.Dockerfile \
--tag ${{env.REGISTRY}}/${{env.IMAGE}}:latest
docker push ${{env.REGISTRY}}/${{env.IMAGE}}:latest
session:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Login to container repo
uses: docker/login-action@v3
with:
registry: ${{env.REGISTRY}}
username: ${{gitea.actor}}
password: ${{secrets.PACKAGES_TOKEN}}
- name: Build and push the image
run: |
docker build . -f docker/session.Dockerfile \
--tag ${{env.REGISTRY}}/${{env.IMAGE}}/session:latest
docker push ${{env.REGISTRY}}/${{env.IMAGE}}/session:latest
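Once a push to main goes green, the two images this workflow publishes can be pulled straight from the registry. A quick sketch, assuming the repository resolves to ngn/nitter as in the example compose file:

  docker pull git.ngn.tf/ngn/nitter:latest           # nitter itself (docker/nitter.Dockerfile)
  docker pull git.ngn.tf/ngn/nitter/session:latest   # session helper (docker/session.Dockerfile)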

25
.gitea/workflows/ups.yml Normal file

@ -0,0 +1,25 @@
name: ups
on:
schedule:
- cron: "@weekly"
jobs:
ups:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Install dependencies
run: |
sudo apt update -y
sudo apt install -y python3 python3-build python3-requests make
- name: Install ups
run: |
git clone https://git.ngn.tf/ngn/ups && cd ups
make && make install
- name: Run ups
run: PATH=~/.local/bin:$PATH ups-check

23
.gitignore vendored

@ -1,16 +1,17 @@
nitter
*.html
*.db
data
tests/__pycache__
tests/geckodriver.log
tests/downloaded_files
tests/latest_logs
tools/gencss
tools/rendermd
public/css/style.css
public/md/*.html
/tests/__pycache__
/tests/geckodriver.log
/tests/downloaded_files
/tests/latest_logs
/tools/gencss
/tools/rendermd
/public/css/style.css
/public/md/*.html
nitter.conf
compose.yml
accounts.*
guest_accounts.json*
sessions.json*
dump.rdb
docker-compose.yml
compose.yml


@ -1,26 +0,0 @@
FROM nimlang/nim:2.0.0-alpine-regular as build
RUN apk --no-cache add libsass-dev pcre
WORKDIR /src
COPY nitter.nimble .
RUN nimble install -y --depsOnly
COPY . .
RUN nimble build -d:danger -d:lto -d:strip
RUN nimble scss
RUN nimble md
FROM alpine:latest
RUN apk --no-cache add pcre ca-certificates
RUN useradd -d /src -u 1001 nitter
WORKDIR /srv
COPY --from=build /srv/nitter ./
COPY --from=build /srv/public ./public
USER nitter
CMD ./nitter


@ -1,5 +1,7 @@
# [ngn.tf] | nitter
# nitter - alternative Twitter frontend
![](https://git.ngn.tf/ngn/nitter/actions/workflows/build.yml/badge.svg)
![](https://git.ngn.tf/ngn/nitter/actions/workflows/docker.yml/badge.svg)
![](https://git.ngn.tf/ngn/nitter/actions/workflows/ups.yml/badge.svg)
A fork of the [nitter](https://github.com/PrivacyDevel/nitter) project, with my personal changes.
A fork of the [nitter](https://github.com/zedeus/nitter) project, with my
personal changes.


@ -3,18 +3,18 @@ services:
container_name: nitter
image: git.ngn.tf/ngn/nitter
ports:
- 80:8080
- 80:8080
volumes:
- ./nitter.conf:/srv/nitter.conf:Z,ro
- ./accounts.jsonl:/srv/accounts.jsonl:Z,ro
- ./nitter.conf:/srv/nitter.conf:Z,ro
- ./sessions.jsonl:/srv/sessions.jsonl:Z,ro
depends_on:
- nitter-redis
- nitter_redis
restart: unless-stopped
user: 998:998
security_opt:
- no-new-privileges:true
- no-new-privileges:true
cap_drop:
- ALL
- ALL
read_only: true
nitter_redis:
@ -22,11 +22,11 @@ services:
image: redis:6-alpine
command: redis-server --save 60 1 --loglevel warning
volumes:
- ./data:/data
- ./data:/data
restart: unless-stopped
user: 999:1000
security_opt:
- no-new-privileges:true
- no-new-privileges:true
cap_drop:
- ALL
- ALL
read_only: true

28
docker/nitter.Dockerfile Normal file

@ -0,0 +1,28 @@
# builds nitter
FROM nimlang/nim:2.2.0-alpine-regular as build
RUN apk --no-cache add libsass-dev pcre
WORKDIR /src
COPY nitter.nimble .
RUN nimble install -y --depsOnly
COPY . .
RUN nimble build -d:danger -d:lto -d:strip --mm:refc && \
nimble scss
# runs nitter
FROM alpine:latest
RUN apk --no-cache add pcre ca-certificates
WORKDIR /srv
COPY --from=build /src/nitter ./
COPY --from=build /src/public ./public
RUN adduser -h /srv -D -s /bin/sh -u 1001 runner && \
chown runner:runner -R /srv
USER runner
CMD ./nitter
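The docker workflow above builds this file with a plain docker build, so the same image can be produced locally. A sketch from the repository root:

  docker build . -f docker/nitter.Dockerfile --tag nitter:latest
  docker run --rm -p 8080:8080 nitter:latest

The run line is only illustrative; a real deployment also mounts nitter.conf and sessions.jsonl into /srv, as the example compose file above shows.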


@ -0,0 +1,6 @@
FROM python
RUN pip install pyotp requests
COPY ./tools/get_session.py /get_session.py
ENTRYPOINT ["python3", "/get_session.py"]
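This image is just a thin wrapper around tools/get_session.py: because of the ENTRYPOINT, anything placed after the image name is passed to the script as arguments. A hedged sketch (the script's exact arguments are not part of this diff):

  docker run --rm -it git.ngn.tf/ngn/nitter/session:latest

The session data it produces presumably ends up in the sessions.jsonl file the server reads.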


@ -1,4 +1,4 @@
[server]
[Server]
hostname = "nitter.net" # for generating links, change this to your own domain/ip
title = "nitter"
address = "0.0.0.0"
@ -6,12 +6,11 @@ port = 8080
https = false # disable to enable cookies when not using https
httpMaxConnections = 100
staticDir = "./public"
accountsFile = "./accounts.jsonl"
[cache]
[Cache]
listMinutes = 240 # how long to cache list info (not the tweets, so keep it high)
rssMinutes = 10 # how long to cache rss queries
redisHost = "localhost" # Change to "nitter-redis" if using docker-compose
redisHost = "localhost" # Change to "nitter_redis" if using docker-compose
redisPort = 6379
redisPassword = ""
redisConnections = 20 # minimum open connections in pool
@ -20,22 +19,16 @@ redisMaxConnections = 30
# goes above this, they're closed when released. don't worry about this unless
# you receive tons of requests per second
[config]
[Config]
hmacKey = "secretkey" # random key for cryptographic signing of video urls
base64Media = false # use base64 encoding for proxied media urls
enableRSS = true # set this to false to disable RSS feeds
enableDebug = false # enable request logs and debug endpoints (/.accounts)
enableDebug = false # enable request logs and debug endpoints (/.sessions)
proxy = "" # http/https url, SOCKS proxies are not supported
proxyAuth = ""
tokenCount = 10
# minimum amount of usable tokens. tokens are used to authorize API requests,
# but they expire after ~1 hour, and have a limit of 500 requests per endpoint.
# the limits reset every 15 minutes, and the pool is filled up so there's
# always at least `tokenCount` usable tokens. only increase this if you receive
# major bursts all the time and don't have a rate limiting setup via e.g. nginx
# Change default preferences here, see src/prefs_impl.nim for a complete list
[preferences]
[Preferences]
theme = "Nitter"
replaceTwitter = "nitter.net"
replaceYouTube = "piped.video"
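With the sections renamed and accountsFile dropped, a minimal config for this fork looks roughly like the sketch below (hostname and hmacKey are placeholders; anything left out falls back to the defaults above, and the session file is located separately via NITTER_SESSIONS_FILE / ./sessions.jsonl, see the nitter.nim diff further down):

  [Server]
  hostname = "nitter.example.com"
  port = 8080

  [Cache]
  redisHost = "nitter_redis"

  [Config]
  hmacKey = "changeme"

  [Preferences]
  theme = "Nitter"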


@ -15,7 +15,6 @@ requires "jester#baca3f"
requires "karax#5cf360c"
requires "sass#7dfdd03"
requires "nimcrypto#a079df9"
requires "markdown#158efe3"
requires "packedjson#9e6fbb6"
requires "supersnappy#6c94198"
requires "redpool#8b7c1db"
@ -29,6 +28,3 @@ requires "oauth#b8c163b"
task scss, "Generate css":
exec "nimble c --hint[Processing]:off -d:danger -r tools/gencss"
task md, "Render md":
exec "nimble c --hint[Processing]:off -d:danger -r tools/rendermd"

File diff suppressed because one or more lines are too long

5
public/js/hls.min.js vendored Normal file

File diff suppressed because one or more lines are too long


@ -1,54 +0,0 @@
# About
Nitter is a free and open source alternative Twitter front-end focused on
privacy and performance. The source is available on GitHub at
<https://github.com/zedeus/nitter>
* No JavaScript or ads
* All requests go through the backend, client never talks to Twitter
* Prevents Twitter from tracking your IP or JavaScript fingerprint
* Uses Twitter's unofficial API (no rate limits or developer account required)
* Lightweight (for [@nim_lang](/nim_lang), 60KB vs 784KB from twitter.com)
* RSS feeds
* Themes
* Mobile support (responsive design)
* AGPLv3 licensed, no proprietary instances permitted
Nitter's GitHub wiki contains
[instances](https://github.com/zedeus/nitter/wiki/Instances) and
[browser extensions](https://github.com/zedeus/nitter/wiki/Extensions)
maintained by the community.
## Why use Nitter?
It's impossible to use Twitter without JavaScript enabled. For privacy-minded
folks, preventing JavaScript analytics and IP-based tracking is important, but
apart from using a VPN and uBlock/uMatrix, it's impossible. Despite being behind
a VPN and using heavy-duty adblockers, you can get accurately tracked with your
[browser's fingerprint](https://restoreprivacy.com/browser-fingerprinting/),
[no JavaScript required](https://noscriptfingerprint.com/). This all became
particularly important after Twitter [removed the
ability](https://www.eff.org/deeplinks/2020/04/twitter-removes-privacy-option-and-shows-why-we-need-strong-privacy-laws)
for users to control whether their data gets sent to advertisers.
Using an instance of Nitter (hosted on a VPS for example), you can browse
Twitter without JavaScript while retaining your privacy. In addition to
respecting your privacy, Nitter is on average around 15 times lighter than
Twitter, and in most cases serves pages faster (eg. timelines load 2-4x faster).
In the future a simple account system will be added that lets you follow Twitter
users, allowing you to have a clean chronological timeline without needing a
Twitter account.
## Donating
Liberapay: <https://liberapay.com/zedeus> \
Patreon: <https://patreon.com/nitter> \
BTC: bc1qp7q4qz0fgfvftm5hwz3vy284nue6jedt44kxya \
ETH: 0x66d84bc3fd031b62857ad18c62f1ba072b011925 \
LTC: ltc1qhsz5nxw6jw9rdtw9qssjeq2h8hqk2f85rdgpkr \
XMR: 42hKayRoEAw4D6G6t8mQHPJHQcXqofjFuVfavqKeNMNUZfeJLJAcNU19i1bGdDvcdN6romiSscWGWJCczFLe9RFhM3d1zpL
## Contact
Feel free to join our [Matrix channel](https://matrix.to/#/#nitter:matrix.org).

5
renovate.json Normal file

@ -0,0 +1,5 @@
{
"extends": ["config:recommended"],
"timezone": "Europe/Istanbul",
"prHourlyLimit": 20
}


@ -1,52 +0,0 @@
#!/bin/bash -e
# Grab oauth token for use with Nitter (requires Twitter account).
# results: {"oauth_token":"xxxxxxxxxx-xxxxxxxxx","oauth_token_secret":"xxxxxxxxxxxxxxxxxxxxx"}
if [ $# -ne 2 ]; then
echo "please specify a username and password"
exit 1
fi
username="${1}"
password="${2}"
bearer_token='AAAAAAAAAAAAAAAAAAAAAFXzAwAAAAAAMHCxpeSDG1gLNLghVe8d74hl6k4%3DRUMF4xAQLsbeBhTSRrCiQpJtxoGWeyHrDb5te2jpGskWDFW82F'
guest_token=$(curl -s -XPOST https://api.twitter.com/1.1/guest/activate.json -H "Authorization: Bearer ${bearer_token}" | jq -r '.guest_token')
base_url='https://api.twitter.com/1.1/onboarding/task.json'
header=(-H "Authorization: Bearer ${bearer_token}" -H "User-Agent: TwitterAndroid/10.21.1" -H "Content-Type: application/json" -H "X-Guest-Token: ${guest_token}")
# start flow
flow_1=$(curl -si -XPOST "${base_url}?flow_name=login" "${header[@]}")
# get 'att', now needed in headers, and 'flow_token' from flow_1
att=$(sed -En 's/^att: (.*)\r/\1/p' <<< "${flow_1}")
flow_token=$(sed -n '$p' <<< "${flow_1}" | jq -r .flow_token)
if [[ -z "$flow_1" || -z "$flow_token" ]]; then
echo "Couldn't retrieve flow token (twitter not reachable?)"
exit 1
fi
# username
token_2=$(curl -s -XPOST "${base_url}" -H "att: ${att}" "${header[@]}" \
-d '{"flow_token":"'"${flow_token}"'","subtask_inputs":[{"subtask_id":"LoginEnterUserIdentifierSSO","settings_list":{"setting_responses":[{"key":"user_identifier","response_data":{"text_data":{"result":"'"${username}"'"}}}],"link":"next_link"}}]}' | jq -r .flow_token)
if [[ -z "$token_2" || "$token_2" == "null" ]]; then
echo "Couldn't retrieve user token (check if login is correct)"
exit 1
fi
# password
token_3=$(curl -s -XPOST "${base_url}" -H "att: ${att}" "${header[@]}" \
-d '{"flow_token":"'"${token_2}"'","subtask_inputs":[{"enter_password":{"password":"'"${password}"'","link":"next_link"},"subtask_id":"LoginEnterPassword"}]}' | jq -r .flow_token)
if [[ -z "$token_3" || "$token_3" == "null" ]]; then
echo "Couldn't retrieve user token (check if password is correct)"
exit 1
fi
# finally print oauth_token and secret
curl -s -XPOST "${base_url}" -H "att: ${att}" "${header[@]}" \
-d '{"flow_token":"'"${token_3}"'","subtask_inputs":[{"check_logged_in_account":{"link":"AccountDuplicationCheck_false"},"subtask_id":"AccountDuplicationCheck"}]}' | \
jq -c '.subtasks[0]|if(.open_account) then [{oauth_token: .open_account.oauth_token, oauth_token_secret: .open_account.oauth_token_secret}] else empty end'


@ -69,23 +69,6 @@ proc getGraphListMembers*(list: List; after=""): Future[Result[User]] {.async.}
let url = graphListMembers ? {"variables": $variables, "features": gqlFeatures}
result = parseGraphListMembers(await fetchRaw(url, Api.listMembers), after)
proc getFavorites*(id: string; cfg: Config; after=""): Future[Profile] {.async.} =
if id.len == 0: return
var
variables = %*{
"userId": id,
"includePromotedContent":false,
"withClientEventToken":false,
"withBirdwatchNotes":false,
"withVoice":true,
"withV2Timeline":false
}
if after.len > 0:
variables["cursor"] = % after
let
url = consts.favorites ? {"variables": $variables, "features": gqlFeatures}
result = parseGraphTimeline(await fetch(url, Api.favorites), after)
proc getGraphTweetResult*(id: string): Future[Tweet] {.async.} =
if id.len == 0: return
let
@ -103,42 +86,6 @@ proc getGraphTweet(id: string; after=""): Future[Conversation] {.async.} =
js = await fetch(graphTweet ? params, Api.tweetDetail)
result = parseGraphConversation(js, id)
proc getGraphFavoriters*(id: string; after=""): Future[UsersTimeline] {.async.} =
if id.len == 0: return
let
cursor = if after.len > 0: "\"cursor\":\"$1\"," % after else: ""
variables = reactorsVariables % [id, cursor]
params = {"variables": variables, "features": gqlFeatures}
js = await fetch(graphFavoriters ? params, Api.favoriters)
result = parseGraphFavoritersTimeline(js, id)
proc getGraphRetweeters*(id: string; after=""): Future[UsersTimeline] {.async.} =
if id.len == 0: return
let
cursor = if after.len > 0: "\"cursor\":\"$1\"," % after else: ""
variables = reactorsVariables % [id, cursor]
params = {"variables": variables, "features": gqlFeatures}
js = await fetch(graphRetweeters ? params, Api.retweeters)
result = parseGraphRetweetersTimeline(js, id)
proc getGraphFollowing*(id: string; after=""): Future[UsersTimeline] {.async.} =
if id.len == 0: return
let
cursor = if after.len > 0: "\"cursor\":\"$1\"," % after else: ""
variables = followVariables % [id, cursor]
params = {"variables": variables, "features": gqlFeatures}
js = await fetch(graphFollowing ? params, Api.following)
result = parseGraphFollowTimeline(js, id)
proc getGraphFollowers*(id: string; after=""): Future[UsersTimeline] {.async.} =
if id.len == 0: return
let
cursor = if after.len > 0: "\"cursor\":\"$1\"," % after else: ""
variables = followVariables % [id, cursor]
params = {"variables": variables, "features": gqlFeatures}
js = await fetch(graphFollowers ? params, Api.followers)
result = parseGraphFollowTimeline(js, id)
proc getReplies*(id, after: string): Future[Result[Chain]] {.async.} =
result = (await getGraphTweet(id, after)).replies
result.beginning = after.len == 0
@ -189,13 +136,13 @@ proc getGraphUserSearch*(query: Query; after=""): Future[Result[User]] {.async.}
result = parseGraphSearch[User](await fetch(url, Api.search), after)
result.query = query
proc getPhotoRail*(name: string): Future[PhotoRail] {.async.} =
if name.len == 0: return
proc getPhotoRail*(id: string): Future[PhotoRail] {.async.} =
if id.len == 0: return
let
ps = genParams({"screen_name": name, "trim_user": "true"},
count="18", ext=false)
url = photoRail ? ps
result = parsePhotoRail(await fetch(url, Api.photoRail))
variables = userTweetsVariables % [id, ""]
params = {"variables": variables, "features": gqlFeatures}
url = graphUserMedia ? params
result = parseGraphPhotoRail(await fetch(url, Api.userMedia))
proc resolve*(url: string; prefs: Prefs): Future[string] {.async.} =
let client = newAsyncHttpClient(maxRedirects=0)


@ -3,33 +3,14 @@ import httpclient, asyncdispatch, options, strutils, uri, times, math, tables
import jsony, packedjson, zippy, oauth1
import types, auth, consts, parserutils, http_pool
import experimental/types/common
import config
const
rlRemaining = "x-rate-limit-remaining"
rlReset = "x-rate-limit-reset"
errorsToSkip = {doesntExist, tweetNotFound, timeout, unauthorized, badRequest}
var pool: HttpPool
proc genParams*(pars: openArray[(string, string)] = @[]; cursor="";
count="20"; ext=true): seq[(string, string)] =
result = timelineParams
for p in pars:
result &= p
if ext:
result &= ("include_ext_alt_text", "1")
result &= ("include_ext_media_stats", "1")
result &= ("include_ext_media_availability", "1")
if count.len > 0:
result &= ("count", count)
if cursor.len > 0:
# The raw cursor often has plus signs, which sometimes get turned into spaces,
# so we need to turn them back into a plus
if " " in cursor:
result &= ("cursor", cursor.replace(" ", "+"))
else:
result &= ("cursor", cursor)
proc getOauthHeader(url, oauthToken, oauthTokenSecret: string): string =
let
encodedUrl = url.replace(",", "%2C").replace("+", "%20")
@ -49,50 +30,37 @@ proc getOauthHeader(url, oauthToken, oauthTokenSecret: string): string =
proc genHeaders*(url, oauthToken, oauthTokenSecret: string): HttpHeaders =
let header = getOauthHeader(url, oauthToken, oauthTokenSecret)
result = newHttpHeaders({
"connection": "keep-alive",
"authorization": header,
"content-type": "application/json",
"x-twitter-active-user": "yes",
"authority": "api.twitter.com",
"authority": "api.x.com",
"accept-encoding": "gzip",
"accept-language": "en-US,en;q=0.9",
"accept": "*/*",
"DNT": "1"
})
template updateAccount() =
if resp.headers.hasKey(rlRemaining):
let
remaining = parseInt(resp.headers[rlRemaining])
reset = parseInt(resp.headers[rlReset])
account.setRateLimit(api, remaining, reset)
template fetchImpl(result, additional_headers, fetchBody) {.dirty.} =
template fetchImpl(result, fetchBody) {.dirty.} =
once:
pool = HttpPool()
var account = await getGuestAccount(api)
if account.oauthToken.len == 0:
echo "[accounts] Empty oauth token, account: ", account.id
var session = await getSession(api)
if session.oauthToken.len == 0:
echo "[sessions] Empty oauth token, session: ", session.id
raise rateLimitError()
try:
var resp: AsyncResponse
var headers = genHeaders($url, account.oauthToken, account.oauthSecret)
for key, value in additional_headers.pairs():
headers.add(key, value)
pool.use(headers):
pool.use(genHeaders($url, session.oauthToken, session.oauthSecret)):
template getContent =
resp = await c.get($url)
result = await resp.body
getContent()
if resp.status == $Http429:
raise rateLimitError()
if resp.status == $Http503:
badClient = true
raise newException(BadClientError, "Bad client")
@ -101,7 +69,7 @@ template fetchImpl(result, additional_headers, fetchBody) {.dirty.} =
let
remaining = parseInt(resp.headers[rlRemaining])
reset = parseInt(resp.headers[rlReset])
account.setRateLimit(api, remaining, reset)
session.setRateLimit(api, remaining, reset)
if result.len > 0:
if resp.headers.getOrDefault("content-encoding") == "gzip":
@ -109,23 +77,25 @@ template fetchImpl(result, additional_headers, fetchBody) {.dirty.} =
if result.startsWith("{\"errors"):
let errors = result.fromJson(Errors)
if errors in {expiredToken, badToken}:
echo "fetch error: ", errors
invalidate(account)
raise rateLimitError()
elif errors in {rateLimited}:
# rate limit hit, resets after 24 hours
setLimited(account, api)
raise rateLimitError()
if errors notin errorsToSkip:
echo "Fetch error, API: ", api, ", errors: ", errors
if errors in {expiredToken, badToken, locked}:
invalidate(session)
raise rateLimitError()
elif errors in {rateLimited}:
# rate limit hit, resets after 24 hours
setLimited(session, api)
raise rateLimitError()
elif result.startsWith("429 Too Many Requests"):
echo "[accounts] 429 error, API: ", api, ", account: ", account.id
account.apis[api].remaining = 0
echo "[sessions] 429 error, API: ", api, ", session: ", session.id
session.apis[api].remaining = 0
# rate limit hit, resets after the 15 minute window
raise rateLimitError()
fetchBody
if resp.status == $Http400:
echo "ERROR 400, ", api, ": ", result
raise newException(InternalError, $url)
except InternalError as e:
raise e
@ -134,24 +104,23 @@ template fetchImpl(result, additional_headers, fetchBody) {.dirty.} =
except OSError as e:
raise e
except Exception as e:
let id = if account.isNil: "null" else: $account.id
echo "error: ", e.name, ", msg: ", e.msg, ", accountId: ", id, ", url: ", url
let id = if session.isNil: "null" else: $session.id
echo "error: ", e.name, ", msg: ", e.msg, ", sessionId: ", id, ", url: ", url
raise rateLimitError()
finally:
release(account)
release(session)
template retry(bod) =
try:
bod
except RateLimitError:
echo "[accounts] Rate limited, retrying ", api, " request..."
echo "[sessions] Rate limited, retrying ", api, " request..."
bod
proc fetch*(url: Uri; api: Api; additional_headers: HttpHeaders = newHttpHeaders()): Future[JsonNode] {.async.} =
proc fetch*(url: Uri; api: Api): Future[JsonNode] {.async.} =
retry:
var body: string
fetchImpl(body, additional_headers):
fetchImpl body:
if body.startsWith('{') or body.startsWith('['):
result = parseJson(body)
else:
@ -159,14 +128,15 @@ proc fetch*(url: Uri; api: Api; additional_headers: HttpHeaders = newHttpHeaders
result = newJNull()
let error = result.getError
if error in {expiredToken, badToken}:
echo "fetchBody error: ", error
invalidate(account)
raise rateLimitError()
if error != null and error notin errorsToSkip:
echo "Fetch error, API: ", api, ", error: ", error
if error in {expiredToken, badToken, locked}:
invalidate(session)
raise rateLimitError()
proc fetchRaw*(url: Uri; api: Api; additional_headers: HttpHeaders = newHttpHeaders()): Future[string] {.async.} =
proc fetchRaw*(url: Uri; api: Api): Future[string] {.async.} =
retry:
fetchImpl(result, additional_headers):
fetchImpl result:
if not (result.startsWith('{') or result.startsWith('[')):
echo resp.status, ": ", result, " --- url: ", url
result.setLen(0)


@ -1,16 +1,15 @@
#SPDX-License-Identifier: AGPL-3.0-only
import std/[asyncdispatch, times, json, random, sequtils, strutils, tables, packedsets, os]
import types
import experimental/parser/guestaccount
import experimental/parser/session
# max requests at a time per account to avoid race conditions
# max requests at a time per session to avoid race conditions
const
maxConcurrentReqs = 2
dayInSeconds = 24 * 60 * 60
hourInSeconds = 60 * 60
apiMaxReqs: Table[Api, int] = {
Api.search: 50,
Api.tweetDetail: 150,
Api.photoRail: 180,
Api.tweetDetail: 500,
Api.userTweets: 500,
Api.userTweetsAndReplies: 500,
Api.userMedia: 500,
@ -24,23 +23,16 @@ const
}.toTable
var
accountPool: seq[GuestAccount]
sessionPool: seq[Session]
enableLogging = false
template log(str: varargs[string, `$`]) =
if enableLogging: echo "[accounts] ", str.join("")
echo "[sessions] ", str.join("")
proc snowflakeToEpoch(flake: int64): int64 =
int64(((flake shr 22) + 1288834974657) div 1000)
proc hasExpired(account: GuestAccount): bool =
let
created = snowflakeToEpoch(account.id)
now = epochTime().int64
daysOld = int(now - created) div dayInSeconds
return daysOld > 30
proc getAccountPoolHealth*(): JsonNode =
proc getSessionPoolHealth*(): JsonNode =
let now = epochTime().int
var
@ -51,38 +43,38 @@ proc getAccountPoolHealth*(): JsonNode =
newest = 0'i64
average = 0'i64
for account in accountPool:
let created = snowflakeToEpoch(account.id)
for session in sessionPool:
let created = snowflakeToEpoch(session.id)
if created > newest:
newest = created
if created < oldest:
oldest = created
average += created
for api in account.apis.keys:
if session.limited:
limited.incl session.id
for api in session.apis.keys:
let
apiStatus = account.apis[api]
apiStatus = session.apis[api]
reqs = apiMaxReqs[api] - apiStatus.remaining
if apiStatus.limited:
limited.incl account.id
# no requests made with this account and endpoint since the limit reset
# no requests made with this session and endpoint since the limit reset
if apiStatus.reset < now:
continue
reqsPerApi.mgetOrPut($api, 0).inc reqs
totalReqs.inc reqs
if accountPool.len > 0:
average = average div accountPool.len
if sessionPool.len > 0:
average = average div sessionPool.len
else:
oldest = 0
average = 0
return %*{
"accounts": %*{
"total": accountPool.len,
"sessions": %*{
"total": sessionPool.len,
"limited": limited.card,
"oldest": $fromUnix(oldest),
"newest": $fromUnix(newest),
@ -94,116 +86,117 @@ proc getAccountPoolHealth*(): JsonNode =
}
}
proc getAccountPoolDebug*(): JsonNode =
proc getSessionPoolDebug*(): JsonNode =
let now = epochTime().int
var list = newJObject()
for account in accountPool:
let accountJson = %*{
for session in sessionPool:
let sessionJson = %*{
"apis": newJObject(),
"pending": account.pending,
"pending": session.pending,
}
for api in account.apis.keys:
if session.limited:
sessionJson["limited"] = %true
for api in session.apis.keys:
let
apiStatus = account.apis[api]
apiStatus = session.apis[api]
obj = %*{}
if apiStatus.reset > now.int:
obj["remaining"] = %apiStatus.remaining
obj["reset"] = %apiStatus.reset
if "remaining" notin obj and not apiStatus.limited:
if "remaining" notin obj:
continue
if apiStatus.limited:
obj["limited"] = %true
accountJson{"apis", $api} = obj
list[$account.id] = accountJson
sessionJson{"apis", $api} = obj
list[$session.id] = sessionJson
return %list
proc rateLimitError*(): ref RateLimitError =
newException(RateLimitError, "rate limited")
proc isLimited(account: GuestAccount; api: Api): bool =
if account.isNil:
proc noSessionsError*(): ref NoSessionsError =
newException(NoSessionsError, "no sessions available")
proc isLimited(session: Session; api: Api): bool =
if session.isNil:
return true
if api in account.apis:
let limit = account.apis[api]
if session.limited and api != Api.userTweets:
if (epochTime().int - session.limitedAt) > hourInSeconds:
session.limited = false
log "resetting limit: ", session.id
return false
else:
return true
if limit.limited and (epochTime().int - limit.limitedAt) > dayInSeconds:
account.apis[api].limited = false
log "resetting limit, api: ", api, ", id: ", account.id
return limit.limited or (limit.remaining <= 10 and limit.reset > epochTime().int)
if api in session.apis:
let limit = session.apis[api]
return limit.remaining <= 10 and limit.reset > epochTime().int
else:
return false
proc isReady(account: GuestAccount; api: Api): bool =
not (account.isNil or account.pending > maxConcurrentReqs or account.isLimited(api))
proc isReady(session: Session; api: Api): bool =
not (session.isNil or session.pending > maxConcurrentReqs or session.isLimited(api))
proc invalidate*(account: var GuestAccount) =
if account.isNil: return
log "invalidating expired account: ", account.id
proc invalidate*(session: var Session) =
if session.isNil: return
log "invalidating: ", session.id
# TODO: This isn't sufficient, but it works for now
let idx = accountPool.find(account)
if idx > -1: accountPool.delete(idx)
account = nil
let idx = sessionPool.find(session)
if idx > -1: sessionPool.delete(idx)
session = nil
proc release*(account: GuestAccount) =
if account.isNil: return
dec account.pending
proc release*(session: Session) =
if session.isNil: return
dec session.pending
proc getGuestAccount*(api: Api): Future[GuestAccount] {.async.} =
for i in 0 ..< accountPool.len:
proc getSession*(api: Api): Future[Session] {.async.} =
for i in 0 ..< sessionPool.len:
if result.isReady(api): break
result = accountPool.sample()
result = sessionPool.sample()
if not result.isNil and result.isReady(api):
inc result.pending
else:
log "no accounts available for API: ", api
raise rateLimitError()
log "no sessions available for API: ", api
raise noSessionsError()
proc setLimited*(account: GuestAccount; api: Api) =
account.apis[api].limited = true
account.apis[api].limitedAt = epochTime().int
log "rate limited, api: ", api, ", reqs left: ", account.apis[api].remaining, ", id: ", account.id
proc setLimited*(session: Session; api: Api) =
session.limited = true
session.limitedAt = epochTime().int
log "rate limited by api: ", api, ", reqs left: ", session.apis[api].remaining, ", id: ", session.id
proc setRateLimit*(account: GuestAccount; api: Api; remaining, reset: int) =
proc setRateLimit*(session: Session; api: Api; remaining, reset: int) =
# avoid undefined behavior in race conditions
if api in account.apis:
let limit = account.apis[api]
if api in session.apis:
let limit = session.apis[api]
if limit.reset >= reset and limit.remaining < remaining:
return
if limit.reset == reset and limit.remaining >= remaining:
account.apis[api].remaining = remaining
session.apis[api].remaining = remaining
return
account.apis[api] = RateLimit(remaining: remaining, reset: reset)
session.apis[api] = RateLimit(remaining: remaining, reset: reset)
proc initAccountPool*(cfg: Config) =
let path = cfg.accountsFile
proc initSessionPool*(cfg: Config; path: string) =
enableLogging = cfg.enableDebug
if not path.endsWith(".jsonl"):
log "Accounts file should be formatted with JSONL"
if path.endsWith(".json"):
log "ERROR: .json is not supported, the file must be a valid JSONL file ending in .jsonl"
quit 1
if not fileExists(path):
log "Failed to access the accounts file (", path, ")"
if not fileExists(path):
log "ERROR: ", path, " not found. This file is required to authenticate API requests."
quit 1
log "Parsing JSONL accounts file: ", path
log "parsing JSONL account sessions file: ", path
for line in path.lines:
accountPool.add parseGuestAccount(line)
sessionPool.add parseSession(line)
let accountsPrePurge = accountPool.len
#accountPool.keepItIf(not it.hasExpired)
log "Successfully added ", accountPool.len, " valid accounts."
if accountsPrePurge > accountPool.len:
log "Purged ", accountsPrePurge - accountPool.len, " expired accounts."
log "successfully added ", sessionPool.len, " valid account sessions"


@ -1,7 +1,6 @@
# SPDX-License-Identifier: AGPL-3.0-only
import parsecfg except Config
import types, strutils
from os import getEnv
proc get*[T](config: parseCfg.Config; section, key: string; default: T): T =
let val = config.getSectionValue(section, key)
@ -16,37 +15,32 @@ proc getConfig*(path: string): (Config, parseCfg.Config) =
let conf = Config(
# Server
address: cfg.get("server", "address", "0.0.0.0"),
port: cfg.get("server", "port", 8080),
useHttps: cfg.get("server", "https", true),
httpMaxConns: cfg.get("server", "httpMaxConnections", 100),
staticDir: cfg.get("server", "staticDir", "./public"),
accountsFile: cfg.get("server", "accountsFile", "./accounts.jsonl"),
title: cfg.get("server", "title", "Nitter"),
hostname: cfg.get("server", "hostname", "nitter.net"),
address: cfg.get("Server", "address", "0.0.0.0"),
port: cfg.get("Server", "port", 8080),
useHttps: cfg.get("Server", "https", true),
httpMaxConns: cfg.get("Server", "httpMaxConnections", 100),
staticDir: cfg.get("Server", "staticDir", "./public"),
title: cfg.get("Server", "title", "Nitter"),
hostname: cfg.get("Server", "hostname", "nitter.net"),
# Cache
listCacheTime: cfg.get("cache", "listMinutes", 120),
rssCacheTime: cfg.get("cache", "rssMinutes", 10),
listCacheTime: cfg.get("Cache", "listMinutes", 120),
rssCacheTime: cfg.get("Cache", "rssMinutes", 10),
redisHost: cfg.get("cache", "redisHost", "localhost"),
redisPort: cfg.get("cache", "redisPort", 6379),
redisConns: cfg.get("cache", "redisConnections", 20),
redisMaxConns: cfg.get("cache", "redisMaxConnections", 30),
redisPassword: cfg.get("cache", "redisPassword", ""),
redisHost: cfg.get("Cache", "redisHost", "localhost"),
redisPort: cfg.get("Cache", "redisPort", 6379),
redisConns: cfg.get("Cache", "redisConnections", 20),
redisMaxConns: cfg.get("Cache", "redisMaxConnections", 30),
redisPassword: cfg.get("Cache", "redisPassword", ""),
# Config
hmacKey: cfg.get("config", "hmacKey", "secretkey"),
base64Media: cfg.get("config", "base64Media", false),
minTokens: cfg.get("config", "tokenCount", 10),
enableRss: cfg.get("config", "enableRSS", true),
enableDebug: cfg.get("config", "enableDebug", false),
proxy: cfg.get("config", "proxy", ""),
proxyAuth: cfg.get("config", "proxyAuth", "")
hmacKey: cfg.get("Config", "hmacKey", "secretkey"),
base64Media: cfg.get("Config", "base64Media", false),
minTokens: cfg.get("Config", "tokenCount", 10),
enableRss: cfg.get("Config", "enableRSS", true),
enableDebug: cfg.get("Config", "enableDebug", false),
proxy: cfg.get("Config", "proxy", ""),
proxyAuth: cfg.get("Config", "proxyAuth", "")
)
return (conf, cfg)
let configPath = getEnv("NITTER_CONF_FILE", "./nitter.conf")
let (cfg*, fullCfg*) = getConfig(configPath)


@ -1,56 +1,28 @@
# SPDX-License-Identifier: AGPL-3.0-only
import uri, sequtils, strutils
import uri, strutils
const
consumerKey* = "3nVuSoBZnx6U4vzUxf5w"
consumerSecret* = "Bcs59EFbbsdF6Sl9Ng71smgStWEGwXXKSjYvPVt7qys"
api = parseUri("https://api.twitter.com")
activate* = $(api / "1.1/guest/activate.json")
gql = parseUri("https://api.x.com") / "graphql"
photoRail* = api / "1.1/statuses/media_timeline.json"
timelineApi = api / "2/timeline"
graphql = api / "graphql"
graphUser* = graphql / "u7wQyGi6oExe8_TRWGMq4Q/UserResultByScreenNameQuery"
graphUserById* = graphql / "oPppcargziU1uDQHAUmH-A/UserResultByIdQuery"
graphUserTweets* = graphql / "3JNH4e9dq1BifLxAa3UMWg/UserWithProfileTweetsQueryV2"
graphUserTweetsAndReplies* = graphql / "8IS8MaO-2EN6GZZZb8jF0g/UserWithProfileTweetsAndRepliesQueryV2"
graphUserMedia* = graphql / "PDfFf8hGeJvUCiTyWtw4wQ/MediaTimelineV2"
graphTweet* = graphql / "q94uRCEn65LZThakYcPT6g/TweetDetail"
graphTweetResult* = graphql / "sITyJdhRPpvpEjg4waUmTA/TweetResultByIdQuery"
graphSearchTimeline* = graphql / "gkjsKepM6gl_HmFWoWKfgg/SearchTimeline"
graphListById* = graphql / "iTpgCtbdxrsJfyx0cFjHqg/ListByRestId"
graphListBySlug* = graphql / "-kmqNvm5Y-cVrfvBy6docg/ListBySlug"
graphListMembers* = graphql / "P4NpVZDqUD_7MEM84L-8nw/ListMembers"
graphListTweets* = graphql / "BbGLL1ZfMibdFNWlk7a0Pw/ListTimeline"
graphFavoriters* = graphql / "mDc_nU8xGv0cLRWtTaIEug/Favoriters"
graphRetweeters* = graphql / "RCR9gqwYD1NEgi9FWzA50A/Retweeters"
graphFollowers* = graphql / "EAqBhgcGr_qPOzhS4Q3scQ/Followers"
graphFollowing* = graphql / "JPZiqKjET7_M1r5Tlr8pyA/Following"
favorites* = graphql / "eSSNbhECHHWWALkkQq-YTA/Likes"
timelineParams* = {
"include_can_media_tag": "1",
"include_cards": "1",
"include_entities": "1",
"include_profile_interstitial_type": "0",
"include_quote_count": "0",
"include_reply_count": "0",
"include_user_entities": "0",
"include_ext_reply_count": "0",
"include_ext_media_color": "0",
"cards_platform": "Web-13",
"tweet_mode": "extended",
"send_error_codes": "1",
"simple_quoted_tweet": "1"
}.toSeq
graphUser* = gql / "u7wQyGi6oExe8_TRWGMq4Q/UserResultByScreenNameQuery"
graphUserById* = gql / "oPppcargziU1uDQHAUmH-A/UserResultByIdQuery"
graphUserTweets* = gql / "JLApJKFY0MxGTzCoK6ps8Q/UserWithProfileTweetsQueryV2"
graphUserTweetsAndReplies* = gql / "Y86LQY7KMvxn5tu3hFTyPg/UserWithProfileTweetsAndRepliesQueryV2"
graphUserMedia* = gql / "PDfFf8hGeJvUCiTyWtw4wQ/MediaTimelineV2"
graphTweet* = gql / "Vorskcd2tZ-tc4Gx3zbk4Q/ConversationTimelineV2"
graphTweetResult* = gql / "sITyJdhRPpvpEjg4waUmTA/TweetResultByIdQuery"
graphSearchTimeline* = gql / "KI9jCXUx3Ymt-hDKLOZb9Q/SearchTimeline"
graphListById* = gql / "oygmAig8kjn0pKsx_bUadQ/ListByRestId"
graphListBySlug* = gql / "88GTz-IPPWLn1EiU8XoNVg/ListBySlug"
graphListMembers* = gql / "kSmxeqEeelqdHSR7jMnb_w/ListMembers"
graphListTweets* = gql / "BbGLL1ZfMibdFNWlk7a0Pw/ListTimeline"
gqlFeatures* = """{
"android_graphql_skip_api_media_color_palette": false,
"blue_business_profile_image_shape_enabled": false,
"c9s_tweet_anatomy_moderator_badge_enabled": false,
"creator_subscriptions_subscription_count_enabled": false,
"creator_subscriptions_tweet_preview_api_enabled": true,
"freedom_of_speech_not_reach_fetch_enabled": false,
@ -72,7 +44,6 @@ const
"responsive_web_twitter_article_tweet_consumption_enabled": false,
"responsive_web_twitter_blue_verified_badge_is_enabled": true,
"rweb_lists_timeline_redesign_enabled": true,
"rweb_video_timestamps_enabled": true,
"spaces_2022_h2_clipping": true,
"spaces_2022_h2_spaces_communities": true,
"standardized_nudges_misinfo": false,
@ -89,7 +60,23 @@ const
"unified_cards_ad_metadata_container_dynamic_card_content_query_enabled": false,
"verified_phone_label_enabled": false,
"vibe_api_enabled": false,
"view_counts_everywhere_api_enabled": false
"view_counts_everywhere_api_enabled": false,
"premium_content_api_read_enabled": false,
"communities_web_enable_tweet_community_results_fetch": false,
"responsive_web_jetfuel_frame": false,
"responsive_web_grok_analyze_button_fetch_trends_enabled": false,
"responsive_web_grok_image_annotation_enabled": false,
"rweb_tipjar_consumption_enabled": false,
"profile_label_improvements_pcf_label_in_post_enabled": false,
"creator_subscriptions_quote_tweet_preview_enabled": false,
"c9s_tweet_anatomy_moderator_badge_enabled": false,
"responsive_web_grok_analyze_post_followups_enabled": false,
"rweb_video_timestamps_enabled": false,
"responsive_web_grok_share_attachment_enabled": false,
"articles_preview_enabled": false,
"immersive_video_status_linkable_timestamps": false,
"articles_api_enabled": false,
"responsive_web_grok_analysis_button_from_backend": false
}""".replace(" ", "").replace("\n", "")
tweetVariables* = """{
@ -123,15 +110,3 @@ const
"rest_id": "$1", $2
"count": 20
}"""
reactorsVariables* = """{
"tweetId" : "$1", $2
"count" : 20,
"includePromotedContent": false
}"""
followVariables* = """{
"userId" : "$1", $2
"count" : 20,
"includePromotedContent": false
}"""


@ -1,21 +0,0 @@
import std/strutils
import jsony
import ../types/guestaccount
from ../../types import GuestAccount
proc toGuestAccount(account: RawAccount): GuestAccount =
let id = account.oauthToken[0 ..< account.oauthToken.find('-')]
result = GuestAccount(
id: parseBiggestInt(id),
oauthToken: account.oauthToken,
oauthSecret: account.oauthTokenSecret
)
proc parseGuestAccount*(raw: string): GuestAccount =
let rawAccount = raw.fromJson(RawAccount)
result = rawAccount.toGuestAccount
proc parseGuestAccounts*(path: string): seq[GuestAccount] =
let rawAccounts = readFile(path).fromJson(seq[RawAccount])
for account in rawAccounts:
result.add account.toGuestAccount


@ -0,0 +1,15 @@
import std/strutils
import jsony
import ../types/session
from ../../types import Session
proc parseSession*(raw: string): Session =
let
session = raw.fromJson(RawSession)
id = session.oauthToken[0 ..< session.oauthToken.find('-')]
result = Session(
id: parseBiggestInt(id),
oauthToken: session.oauthToken,
oauthSecret: session.oauthTokenSecret
)


@ -1,4 +1,4 @@
type
RawAccount* = object
RawSession* = object
oauthToken*: string
oauthTokenSecret*: string


@ -11,6 +11,8 @@ const
let
twRegex = re"(?<=(?<!\S)https:\/\/|(?<=\s))(www\.|mobile\.)?twitter\.com"
twLinkRegex = re"""<a href="https:\/\/twitter.com([^"]+)">twitter\.com(\S+)</a>"""
xRegex = re"(?<=(?<!\S)https:\/\/|(?<=\s))(www\.|mobile\.)?x\.com"
xLinkRegex = re"""<a href="https:\/\/x.com([^"]+)">x\.com(\S+)</a>"""
ytRegex = re(r"([A-z.]+\.)?youtu(be\.com|\.be)", {reStudy, reIgnoreCase})
@ -56,12 +58,18 @@ proc replaceUrls*(body: string; prefs: Prefs; absolute=""): string =
if prefs.replaceYouTube.len > 0 and "youtu" in result:
result = result.replace(ytRegex, prefs.replaceYouTube)
if prefs.replaceTwitter.len > 0 and ("twitter.com" in body or tco in body):
result = result.replace(tco, https & prefs.replaceTwitter & "/t.co")
result = result.replace(cards, prefs.replaceTwitter & "/cards")
result = result.replace(twRegex, prefs.replaceTwitter)
result = result.replacef(twLinkRegex, a(
prefs.replaceTwitter & "$2", href = https & prefs.replaceTwitter & "$1"))
if prefs.replaceTwitter.len > 0:
if tco in result:
result = result.replace(tco, https & prefs.replaceTwitter & "/t.co")
if "x.com" in result:
result = result.replace(xRegex, prefs.replaceTwitter)
result = result.replacef(xLinkRegex, a(
prefs.replaceTwitter & "$2", href = https & prefs.replaceTwitter & "$1"))
if "twitter.com" in result:
result = result.replace(cards, prefs.replaceTwitter & "/cards")
result = result.replace(twRegex, prefs.replaceTwitter)
result = result.replacef(twLinkRegex, a(
prefs.replaceTwitter & "$2", href = https & prefs.replaceTwitter & "$1"))
if prefs.replaceReddit.len > 0 and ("reddit.com" in result or "redd.it" in result):
result = result.replace(rdShortRegex, prefs.replaceReddit & "/comments/")
@ -82,6 +90,8 @@ proc proxifyVideo*(manifest: string; proxy: bool): string =
for line in manifest.splitLines:
let url =
if line.startsWith("#EXT-X-MAP:URI"): line[16 .. ^2]
elif line.startsWith("#EXT-X-MEDIA") and "URI=" in line:
line[line.find("URI=") + 5 .. -1 + line.find("\"", start= 5 + line.find("URI="))]
else: line
if url.startsWith('/'):
let path = "https://video.twimg.com" & url


@ -1,6 +1,5 @@
# SPDX-License-Identifier: AGPL-3.0-only
import asyncdispatch, strformat, logging
import config
from net import Port
from htmlgen import a
from os import getEnv
@ -8,15 +7,18 @@ from os import getEnv
import jester
import types, config, prefs, formatters, redis_cache, http_pool, auth
import views/[general, about]
import views/[general]
import routes/[
preferences, timeline, status, media, search, rss, list, debug,
unsupported, embed, resolver, router_utils]
const instancesUrl = "https://github.com/zedeus/nitter/wiki/Instances"
const issuesUrl = "https://github.com/zedeus/nitter/issues"
let
configPath = getEnv("NITTER_CONF_FILE", "./nitter.conf")
(cfg, fullCfg) = getConfig(configPath)
initAccountPool(cfg)
sessionsPath = getEnv("NITTER_SESSIONS_FILE", "./sessions.jsonl")
initSessionPool(cfg, sessionsPath)
if not cfg.enableDebug:
# Silence Jester's query warning
@ -32,7 +34,6 @@ setHmacKey(cfg.hmacKey)
setProxyEncoding(cfg.base64Media)
setMaxHttpConns(cfg.httpMaxConns)
setHttpProxy(cfg.proxy, cfg.proxyAuth)
initAboutPage(cfg.staticDir)
waitFor initRedisPool(cfg)
stdout.write &"Connected to Redis at {cfg.redisHost}:{cfg.redisPort}\n"
@ -60,15 +61,6 @@ routes:
get "/":
resp renderMain(renderSearch(), request, cfg, themePrefs())
get "/about":
resp renderMain(renderAbout(), request, cfg, themePrefs())
get "/explore":
redirect("/about")
get "/help":
redirect("/about")
get "/i/redirect":
let url = decodeUrl(@"url")
if url.len == 0: resp Http404
@ -79,18 +71,21 @@ routes:
error InternalError:
echo error.exc.name, ": ", error.exc.msg
const link = a("open a GitHub issue", href = issuesUrl)
const link = a("ngn@ngn.tf", href = "mailto:ngn@ngn.tf")
resp Http500, showError(
&"An error occurred, please {link} with the URL you tried to visit.", cfg)
&"An error occurred, please report to {link}", cfg)
error BadClientError:
echo error.exc.name, ": ", error.exc.msg
resp Http500, showError("Network error occurred, please try again.", cfg)
error RateLimitError:
const link = a("another instance", href = instancesUrl)
resp Http429, showError(
&"Instance has been rate limited.<br>Use {link} or try again later.", cfg)
&"Instance has been rate limited.", cfg)
error NoSessionsError:
resp Http429, showError(
&"Instance has no auth tokens, or is fully rate limited.", cfg)
extend rss, ""
extend status, ""
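The config file and the sessions file are now located through environment variables with defaults, so both can be overridden at launch. A sketch with placeholder paths:

  NITTER_CONF_FILE=/etc/nitter/nitter.conf NITTER_SESSIONS_FILE=/etc/nitter/sessions.jsonl ./nitter

Without the overrides it falls back to ./nitter.conf and ./sessions.jsonl in the working directory.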


@ -3,7 +3,6 @@ import strutils, options, times, math
import packedjson, packedjson/deserialiser
import types, parserutils, utils
import experimental/parser/unifiedcard
import std/tables
proc parseGraphTweet(js: JsonNode; isLegacy=false): Tweet
@ -33,8 +32,7 @@ proc parseGraphUser(js: JsonNode): User =
var user = js{"user_result", "result"}
if user.isNull:
user = ? js{"user_results", "result"}
result = parseUser(user{"legacy"})
result = parseUser(user{"legacy"}, user{"rest_id"}.getStr)
if result.verifiedType == VerifiedType.none and user{"is_blue_verified"}.getBool(false):
result.verifiedType = blue
@ -238,11 +236,8 @@ proc parseTweet(js: JsonNode; jsCard: JsonNode = newJNull()): Tweet =
# graphql
with rt, js{"retweeted_status_result", "result"}:
# needed due to weird edgecase where the actual tweet data isn't included
var rt_tweet = rt
if "tweet" in rt:
rt_tweet = rt{"tweet"}
if "legacy" in rt_tweet:
result.retweet = some parseGraphTweet(rt_tweet)
if "legacy" in rt:
result.retweet = some parseGraphTweet(rt)
return
if jsCard.kind != JNull:
@ -294,138 +289,6 @@ proc parseTweet(js: JsonNode; jsCard: JsonNode = newJNull()): Tweet =
result.text.removeSuffix(" Learn more.")
result.available = false
proc parseLegacyTweet(js: JsonNode): Tweet =
result = parseTweet(js, js{"card"})
if not result.isNil and result.available:
result.user = parseUser(js{"user"})
if result.quote.isSome:
result.quote = some parseLegacyTweet(js{"quoted_status"})
proc parseTweetSearch*(js: JsonNode; after=""): Timeline =
result.beginning = after.len == 0
if js.kind == JNull or "modules" notin js or js{"modules"}.len == 0:
return
for item in js{"modules"}:
with tweet, item{"status", "data"}:
let parsed = parseLegacyTweet(tweet)
if parsed.retweet.isSome:
parsed.retweet = some parseLegacyTweet(tweet{"retweeted_status"})
result.content.add @[parsed]
if result.content.len > 0:
result.bottom = $(result.content[^1][0].id - 1)
proc finalizeTweet(global: GlobalObjects; id: string): Tweet =
let intId = if id.len > 0: parseBiggestInt(id) else: 0
result = global.tweets.getOrDefault(id, Tweet(id: intId))
if result.quote.isSome:
let quote = get(result.quote).id
if $quote in global.tweets:
result.quote = some global.tweets[$quote]
else:
result.quote = some Tweet()
if result.retweet.isSome:
let rt = get(result.retweet).id
if $rt in global.tweets:
result.retweet = some finalizeTweet(global, $rt)
else:
result.retweet = some Tweet()
proc parsePin(js: JsonNode; global: GlobalObjects): Tweet =
let pin = js{"pinEntry", "entry", "entryId"}.getStr
if pin.len == 0: return
let id = pin.getId
if id notin global.tweets: return
global.tweets[id].pinned = true
return finalizeTweet(global, id)
proc parseGlobalObjects(js: JsonNode): GlobalObjects =
result = GlobalObjects()
let
tweets = ? js{"globalObjects", "tweets"}
users = ? js{"globalObjects", "users"}
for k, v in users:
result.users[k] = parseUser(v, k)
for k, v in tweets:
var tweet = parseTweet(v, v{"card"})
if tweet.user.id in result.users:
tweet.user = result.users[tweet.user.id]
result.tweets[k] = tweet
proc parseInstructions(res: var Profile; global: GlobalObjects; js: JsonNode) =
if js.kind != JArray or js.len == 0:
return
for i in js:
if res.tweets.beginning and i{"pinEntry"}.notNull:
with pin, parsePin(i, global):
res.pinned = some pin
with r, i{"replaceEntry", "entry"}:
if "top" in r{"entryId"}.getStr:
res.tweets.top = r.getCursor
elif "bottom" in r{"entryId"}.getStr:
res.tweets.bottom = r.getCursor
proc parseTimeline*(js: JsonNode; after=""): Profile =
result = Profile(tweets: Timeline(beginning: after.len == 0))
let global = parseGlobalObjects(? js)
let instructions = ? js{"timeline", "instructions"}
if instructions.len == 0: return
result.parseInstructions(global, instructions)
var entries: JsonNode
for i in instructions:
if "addEntries" in i:
entries = i{"addEntries", "entries"}
for e in ? entries:
let entry = e{"entryId"}.getStr
if "tweet" in entry or entry.startsWith("sq-I-t") or "tombstone" in entry:
let tweet = finalizeTweet(global, e.getEntryId)
if not tweet.available: continue
result.tweets.content.add tweet
elif "cursor-top" in entry:
result.tweets.top = e.getCursor
elif "cursor-bottom" in entry:
result.tweets.bottom = e.getCursor
elif entry.startsWith("sq-cursor"):
with cursor, e{"content", "operation", "cursor"}:
if cursor{"cursorType"}.getStr == "Bottom":
result.tweets.bottom = cursor{"value"}.getStr
else:
result.tweets.top = cursor{"value"}.getStr
proc parsePhotoRail*(js: JsonNode): PhotoRail =
with error, js{"error"}:
if error.getStr == "Not authorized.":
return
for tweet in js:
let
t = parseTweet(tweet, js{"tweet_card"})
url = if t.photos.len > 0: t.photos[0]
elif t.video.isSome: get(t.video).thumb
elif t.gif.isSome: get(t.gif).thumb
elif t.card.isSome: get(t.card).image
else: ""
if url.len == 0: continue
result.add GalleryPhoto(url: url, tweetId: $t.id)
proc parseGraphTweet(js: JsonNode; isLegacy=false): Tweet =
if js.kind == JNull:
return Tweet()
@ -473,7 +336,7 @@ proc parseGraphThread(js: JsonNode): tuple[thread: Chain; self: bool] =
let cursor = t{"item", "content", "value"}
result.thread.cursor = cursor.getStr
result.thread.hasMore = true
elif "tweet" in entryId:
elif "tweet" in entryId and "promoted" notin entryId:
let
isLegacy = t{"item"}.hasKey("itemContent")
(contentKey, resultKey) = if isLegacy: ("itemContent", "tweet_results")
@ -489,54 +352,60 @@ proc parseGraphTweetResult*(js: JsonNode): Tweet =
with tweet, js{"data", "tweet_result", "result"}:
result = parseGraphTweet(tweet, false)
proc parseGraphConversation*(js: JsonNode; tweetId: string): Conversation =
proc parseGraphConversation*(js: JsonNode; tweetId: string; v2=true): Conversation =
result = Conversation(replies: Result[Chain](beginning: true))
let instructions = ? js{"data", "threaded_conversation_with_injections_v2", "instructions"}
let
rootKey = if v2: "timeline_response" else: "threaded_conversation_with_injections_v2"
contentKey = if v2: "content" else: "itemContent"
resultKey = if v2: "tweetResult" else: "tweet_results"
let instructions = ? js{"data", rootKey, "instructions"}
if instructions.len == 0:
return
for e in instructions[0]{"entries"}:
let entryId = e{"entryId"}.getStr
if entryId.startsWith("tweet"):
with tweetResult, e{"content", "itemContent", "tweet_results", "result"}:
let tweet = parseGraphTweet(tweetResult, true)
for i in instructions:
if i{"__typename"}.getStr == "TimelineAddEntries":
for e in i{"entries"}:
let entryId = e{"entryId"}.getStr
if entryId.startsWith("tweet"):
with tweetResult, e{"content", contentKey, resultKey, "result"}:
let tweet = parseGraphTweet(tweetResult, not v2)
if not tweet.available:
tweet.id = parseBiggestInt(entryId.getId())
if not tweet.available:
tweet.id = parseBiggestInt(entryId.getId())
if $tweet.id == tweetId:
result.tweet = tweet
else:
result.before.content.add tweet
elif entryId.startsWith("tombstone"):
let id = entryId.getId()
let tweet = Tweet(
id: parseBiggestInt(id),
available: false,
text: e{"content", "itemContent", "tombstoneInfo", "richText"}.getTombstone
)
if $tweet.id == tweetId:
result.tweet = tweet
else:
result.before.content.add tweet
elif entryId.startsWith("conversationthread"):
let (thread, self) = parseGraphThread(e)
if self:
result.after = thread
elif thread.content.len > 0:
result.replies.content.add thread
elif entryId.startsWith("tombstone"):
let id = entryId.getId()
let tweet = Tweet(
id: parseBiggestInt(id),
available: false,
text: e{"content", contentKey, "tombstoneInfo", "richText"}.getTombstone
)
if id == tweetId:
result.tweet = tweet
else:
result.before.content.add tweet
elif entryId.startsWith("conversationthread"):
let (thread, self) = parseGraphThread(e)
if self:
result.after = thread
else:
result.replies.content.add thread
elif entryId.startsWith("cursor-bottom"):
result.replies.bottom = e{"content", "itemContent", "value"}.getStr
if id == tweetId:
result.tweet = tweet
else:
result.before.content.add tweet
elif entryId.startsWith("cursor-bottom"):
result.replies.bottom = e{"content", contentKey, "value"}.getStr
proc parseGraphTimeline*(js: JsonNode; root: string; after=""): Profile =
result = Profile(tweets: Timeline(beginning: after.len == 0))
let instructions =
if root == "list": ? js{"data", "list", "timeline_response", "timeline", "instructions"}
elif root == "user": ? js{"data", "user_result", "result", "timeline_response", "timeline", "instructions"}
else: ? js{"data", "user", "result", "timeline", "timeline", "instructions"}
else: ? js{"data", "user_result", "result", "timeline_response", "timeline", "instructions"}
if instructions.len == 0:
return
@ -556,21 +425,6 @@ proc parseGraphTimeline*(js: JsonNode; root: string; after=""): Profile =
result.tweets.content.add thread.content
elif entryId.startsWith("cursor-bottom"):
result.tweets.bottom = e{"content", "value"}.getStr
# TODO cleanup
if i{"type"}.getStr == "TimelineAddEntries":
for e in i{"entries"}:
let entryId = e{"entryId"}.getStr
if entryId.startsWith("tweet"):
with tweetResult, e{"content", "itemContent", "tweet_results", "result"}:
let tweet = parseGraphTweet(tweetResult, false)
if not tweet.available:
tweet.id = parseBiggestInt(entryId.getId())
result.tweets.content.add tweet
elif "-conversation-" in entryId or entryId.startsWith("homeConversation"):
let (thread, self) = parseGraphThread(e)
result.tweets.content.add thread.content
elif entryId.startsWith("cursor-bottom"):
result.tweets.bottom = e{"content", "value"}.getStr
if after.len == 0 and i{"__typename"}.getStr == "TimelinePinEntry":
with tweetResult, i{"entry", "content", "content", "tweetResult", "result"}:
let tweet = parseGraphTweet(tweetResult, false)
@ -581,35 +435,34 @@ proc parseGraphTimeline*(js: JsonNode; root: string; after=""): Profile =
tweet.id = parseBiggestInt(entryId)
result.pinned = some tweet
proc parseGraphUsersTimeline(timeline: JsonNode; after=""): UsersTimeline =
result = UsersTimeline(beginning: after.len == 0)
proc parseGraphPhotoRail*(js: JsonNode): PhotoRail =
result = @[]
let instructions = ? timeline{"instructions"}
if instructions.len == 0:
return
let instructions =
? js{"data", "user_result", "result", "timeline_response", "timeline", "instructions"}
for i in instructions:
if i{"type"}.getStr == "TimelineAddEntries":
if i{"__typename"}.getStr == "TimelineAddEntries":
for e in i{"entries"}:
let entryId = e{"entryId"}.getStr
if entryId.startsWith("user"):
with graphUser, e{"content", "itemContent"}:
let user = parseGraphUser(graphUser)
result.content.add user
elif entryId.startsWith("cursor-bottom"):
result.bottom = e{"content", "value"}.getStr
elif entryId.startsWith("cursor-top"):
result.top = e{"content", "value"}.getStr
if entryId.startsWith("tweet"):
with tweetResult, e{"content", "content", "tweetResult", "result"}:
let t = parseGraphTweet(tweetResult, false)
if not t.available:
t.id = parseBiggestInt(entryId.getId())
proc parseGraphFavoritersTimeline*(js: JsonNode; root: string; after=""): UsersTimeline =
return parseGraphUsersTimeline(js{"data", "favoriters_timeline", "timeline"}, after)
let url =
if t.photos.len > 0: t.photos[0]
elif t.video.isSome: get(t.video).thumb
elif t.gif.isSome: get(t.gif).thumb
elif t.card.isSome: get(t.card).image
else: ""
proc parseGraphRetweetersTimeline*(js: JsonNode; root: string; after=""): UsersTimeline =
return parseGraphUsersTimeline(js{"data", "retweeters_timeline", "timeline"}, after)
if url.len > 0:
result.add GalleryPhoto(url: url, tweetId: $t.id)
proc parseGraphFollowTimeline*(js: JsonNode; root: string; after=""): UsersTimeline =
return parseGraphUsersTimeline(js{"data", "user", "result", "timeline", "timeline"}, after)
if result.len == 16:
break
proc parseGraphSearch*[T: User | Tweets](js: JsonNode; after=""): Result[T] =
result = Result[T](beginning: after.len == 0)


@ -40,13 +40,6 @@ proc getMediaQuery*(name: string): Query =
sep: "OR"
)
proc getFavoritesQuery*(name: string): Query =
Query(
kind: favorites,
fromUser: @[name]
)
proc getReplyQuery*(name: string): Query =
Query(
kind: replies,


@ -86,7 +86,7 @@ proc cache*(data: List) {.async.} =
await setEx(data.listKey, listCacheTime, compress(toFlatty(data)))
proc cache*(data: PhotoRail; name: string) {.async.} =
await setEx("pr:" & toLower(name), baseCacheTime * 2, compress(toFlatty(data)))
await setEx("pr2:" & toLower(name), baseCacheTime * 2, compress(toFlatty(data)))
proc cache*(data: User) {.async.} =
if data.username.len == 0: return
@ -158,14 +158,14 @@ proc getCachedUsername*(userId: string): Future[string] {.async.} =
# if not result.isNil:
# await cache(result)
proc getCachedPhotoRail*(name: string): Future[PhotoRail] {.async.} =
if name.len == 0: return
let rail = await get("pr:" & toLower(name))
proc getCachedPhotoRail*(id: string): Future[PhotoRail] {.async.} =
if id.len == 0: return
let rail = await get("pr2:" & toLower(id))
if rail != redisNil:
rail.deserialize(PhotoRail)
else:
result = await getPhotoRail(name)
await cache(result, name)
result = await getPhotoRail(id)
await cache(result, id)
proc getCachedList*(username=""; slug=""; id=""): Future[List] {.async.} =
let list = if id.len == 0: redisNil


@ -6,8 +6,8 @@ import ".."/[auth, types]
proc createDebugRouter*(cfg: Config) =
router debug:
get "/.health":
respJson getAccountPoolHealth()
respJson getSessionPoolHealth()
get "/.accounts":
get "/.sessions":
cond cfg.enableDebug
respJson getAccountPoolDebug()
respJson getSessionPoolDebug()


@ -37,7 +37,8 @@ proc proxyMedia*(req: jester.Request; url: string): Future[HttpCode] {.async.} =
try:
let res = await client.get(url)
if res.status != "200 OK":
echo "[media] Proxying failed, status: $1, url: $2" % [res.status, url]
if res.status != "404 Not Found":
echo "[media] Proxying failed, status: $1, url: $2" % [res.status, url]
return Http404
let hashed = $hash(url)
@ -122,7 +123,7 @@ proc createMediaRouter*(cfg: Config) =
cond "http" in url
if getHmac(url) != request.matches[1]:
resp showError("Failed to verify signature", cfg)
resp Http403, showError("Failed to verify signature", cfg)
if ".mp4" in url or ".ts" in url or ".m4s" in url:
let code = await proxyMedia(request, url)


@ -23,7 +23,7 @@ proc timelineRss*(req: Request; cfg: Config; query: Query): Future[Rss] {.async.
names = getNames(name)
if names.len == 1:
profile = await fetchProfile(after, query, cfg, skipRail=true, skipPinned=true)
profile = await fetchProfile(after, query, skipRail=true, skipPinned=true)
else:
var q = query
q.fromUser = names
@ -102,7 +102,7 @@ proc createRssRouter*(cfg: Config) =
get "/@name/@tab/rss":
cond cfg.enableRss
cond '.' notin @"name"
cond @"tab" in ["with_replies", "media", "favorites", "search"]
cond @"tab" in ["with_replies", "media", "search"]
let
name = @"name"
tab = @"tab"
@ -110,7 +110,6 @@ proc createRssRouter*(cfg: Config) =
case tab
of "with_replies": getReplyQuery(name)
of "media": getMediaQuery(name)
of "favorites": getFavoritesQuery(name)
of "search": initQuery(params(request), name=name)
else: Query(fromUser: @[name])


@ -5,7 +5,7 @@ import jester, karax/vdom
import router_utils
import ".."/[types, formatters, api]
import ../views/[general, status, search]
import ../views/[general, status]
export uri, sequtils, options, sugar
export router_utils
@ -14,29 +14,6 @@ export status
proc createStatusRouter*(cfg: Config) =
router status:
get "/@name/status/@id/@reactors":
cond '.' notin @"name"
let id = @"id"
if id.len > 19 or id.any(c => not c.isDigit):
resp Http404, showError("Invalid tweet ID", cfg)
let prefs = cookiePrefs()
# used for the infinite scroll feature
if @"scroll".len > 0:
let replies = await getReplies(id, getCursor())
if replies.content.len == 0:
resp Http404, ""
resp $renderReplies(replies, prefs, getPath())
if @"reactors" == "favoriters":
resp renderMain(renderUserList(await getGraphFavoriters(id, getCursor()), prefs),
request, cfg, prefs)
elif @"reactors" == "retweeters":
resp renderMain(renderUserList(await getGraphRetweeters(id, getCursor()), prefs),
request, cfg, prefs)
get "/@name/status/@id/?":
cond '.' notin @"name"
let id = @"id"


@ -16,7 +16,6 @@ proc getQuery*(request: Request; tab, name: string): Query =
case tab
of "with_replies": getReplyQuery(name)
of "media": getMediaQuery(name)
of "favorites": getFavoritesQuery(name)
of "search": initQuery(params(request), name=name)
else: Query(fromUser: @[name])
@ -28,7 +27,7 @@ template skipIf[T](cond: bool; default; body: Future[T]): Future[T] =
else:
body
proc fetchProfile*(after: string; query: Query; cfg: Config; skipRail=false;
proc fetchProfile*(after: string; query: Query; skipRail=false;
skipPinned=false): Future[Profile] {.async.} =
let
name = query.fromUser[0]
@ -48,7 +47,7 @@ proc fetchProfile*(after: string; query: Query; cfg: Config; skipRail=false;
let
rail =
skipIf(skipRail or query.kind == media, @[]):
getCachedPhotoRail(name)
getCachedPhotoRail(userId)
user = getCachedUser(name)
@ -57,7 +56,6 @@ proc fetchProfile*(after: string; query: Query; cfg: Config; skipRail=false;
of posts: await getGraphUserTweets(userId, TimelineKind.tweets, after)
of replies: await getGraphUserTweets(userId, TimelineKind.replies, after)
of media: await getGraphUserTweets(userId, TimelineKind.media, after)
of favorites: await getFavorites(userId, cfg, after)
else: Profile(tweets: await getGraphTweetSearch(query, after))
result.user = await user
@ -73,7 +71,7 @@ proc showTimeline*(request: Request; query: Query; cfg: Config; prefs: Prefs;
html = renderTweetSearch(timeline, prefs, getPath())
return renderMain(html, request, cfg, prefs, "Multi", rss=rss)
var profile = await fetchProfile(after, query, cfg, skipPinned=prefs.hidePins)
var profile = await fetchProfile(after, query, skipPinned=prefs.hidePins)
template u: untyped = profile.user
if u.suspended:
@ -81,7 +79,7 @@ proc showTimeline*(request: Request; query: Query; cfg: Config; prefs: Prefs;
if profile.user.id.len == 0: return
let pHtml = renderProfile(profile, cfg, prefs, getPath())
let pHtml = renderProfile(profile, prefs, getPath())
result = renderMain(pHtml, request, cfg, prefs, pageTitle(u), pageDesc(u),
rss=rss, images = @[u.getUserPic("_400x400")],
banner=u.banner)
@ -111,42 +109,35 @@ proc createTimelineRouter*(cfg: Config) =
get "/@name/?@tab?/?":
cond '.' notin @"name"
cond @"name" notin ["pic", "gif", "video", "search", "settings", "login", "intent", "i"]
cond @"tab" in ["with_replies", "media", "search", "favorites", "following", "followers", ""]
cond @"tab" in ["with_replies", "media", "search", ""]
let
prefs = cookiePrefs()
after = getCursor()
names = getNames(@"name")
tab = @"tab"
case tab:
of "followers":
resp renderMain(renderUserList(await getGraphFollowers(await getUserId(@"name"), getCursor()), prefs), request, cfg, prefs)
of "following":
resp renderMain(renderUserList(await getGraphFollowing(await getUserId(@"name"), getCursor()), prefs), request, cfg, prefs)
var query = request.getQuery(@"tab", @"name")
if names.len != 1:
query.fromUser = names
# used for the infinite scroll feature
if @"scroll".len > 0:
if query.fromUser.len != 1:
var timeline = await getGraphTweetSearch(query, after)
if timeline.content.len == 0: resp Http404
timeline.beginning = true
resp $renderTweetSearch(timeline, prefs, getPath())
else:
var query = request.getQuery(@"tab", @"name")
if names.len != 1:
query.fromUser = names
var profile = await fetchProfile(after, query, skipRail=true)
if profile.tweets.content.len == 0: resp Http404
profile.tweets.beginning = true
resp $renderTimelineTweets(profile.tweets, prefs, getPath())
# used for the infinite scroll feature
if @"scroll".len > 0:
if query.fromUser.len != 1:
var timeline = await getGraphTweetSearch(query, after)
if timeline.content.len == 0: resp Http404
timeline.beginning = true
resp $renderTweetSearch(timeline, prefs, getPath())
else:
var profile = await fetchProfile(after, query, cfg, skipRail=true)
if profile.tweets.content.len == 0: resp Http404
profile.tweets.beginning = true
resp $renderTimelineTweets(profile.tweets, prefs, getPath())
let rss =
if @"tab".len == 0:
"/$1/rss" % @"name"
elif @"tab" == "search":
"/$1/search/rss?$2" % [@"name", genQueryUrl(query)]
else:
"/$1/$2/rss" % [@"name", @"tab"]
let rss =
if @"tab".len == 0:
"/$1/rss" % @"name"
elif @"tab" == "search":
"/$1/search/rss?$2" % [@"name", genQueryUrl(query)]
else:
"/$1/$2/rss" % [@"name", @"tab"]
respTimeline(await showTimeline(request, query, cfg, prefs, rss, after))
respTimeline(await showTimeline(request, query, cfg, prefs, rss, after))


@ -12,11 +12,10 @@ proc createUnsupportedRouter*(cfg: Config) =
template feature {.dirty.} =
resp renderMain(renderFeature(), request, cfg, themePrefs())
get "/about/feature": feature()
get "/login/?@i?": feature()
get "/@name/lists/?": feature()
get "/intent/?@i?":
get "/intent/?@i?":
cond @"i" notin ["user"]
feature()


@ -207,7 +207,6 @@
padding-top: 5px;
min-width: 1em;
margin-right: 0.8em;
pointer-events: all;
}
.show-thread {


@ -16,6 +16,8 @@ video {
}
.video-container {
min-height: 80px;
min-width: 200px;
max-height: 530px;
margin: 0;
display: flex;


@ -6,6 +6,7 @@ genPrefsType()
type
RateLimitError* = object of CatchableError
NoSessionsError* = object of CatchableError
InternalError* = object of CatchableError
BadClientError* = object of CatchableError
@ -15,7 +16,6 @@ type
Api* {.pure.} = enum
tweetDetail
tweetResult
photoRail
search
list
listBySlug
@ -23,26 +23,21 @@ type
listTweets
userRestId
userScreenName
favorites
userTweets
userTweetsAndReplies
userMedia
favoriters
retweeters
following
followers
RateLimit* = object
remaining*: int
reset*: int
limited*: bool
limitedAt*: int
GuestAccount* = ref object
Session* = ref object
id*: int64
oauthToken*: string
oauthSecret*: string
pending*: int
limited*: bool
limitedAt*: int
apis*: Table[Api, RateLimit]
Error* = enum
@ -50,8 +45,10 @@ type
noUserMatches = 17
protectedUser = 22
missingParams = 25
timeout = 29
couldntAuth = 32
doesntExist = 34
unauthorized = 37
invalidParam = 47
userNotFound = 50
suspended = 63
@ -61,7 +58,9 @@ type
tweetNotFound = 144
tweetNotAuthorized = 179
forbidden = 200
badRequest = 214
badToken = 239
locked = 326
noCsrf = 353
tweetUnavailable = 421
tweetCensored = 422
@ -116,7 +115,7 @@ type
variants*: seq[VideoVariant]
QueryKind* = enum
posts, replies, media, users, tweets, userList, favorites
posts, replies, media, users, tweets, userList
Query* = object
kind*: QueryKind
@ -236,7 +235,6 @@ type
replies*: Result[Chain]
Timeline* = Result[Tweets]
UsersTimeline* = Result[User]
Profile* = object
user*: User
@ -265,7 +263,6 @@ type
title*: string
hostname*: string
staticDir*: string
accountsFile*: string
hmacKey*: string
base64Media*: bool
@ -283,7 +280,6 @@ type
redisConns*: int
redisMaxConns*: int
redisPassword*: string
redisDb*: int
Rss* = object
feed*, cursor*: string


@ -1,26 +0,0 @@
# SPDX-License-Identifier: AGPL-3.0-only
import os, strformat
import karax/[karaxdsl, vdom]
const
date = staticExec("git show -s --format=\"%cd\" --date=format:\"%Y.%m.%d\"")
hash = staticExec("git show -s --format=\"%h\"")
link = "https://github.com/zedeus/nitter/commit/" & hash
version = &"{date}-{hash}"
var aboutHtml: string
proc initAboutPage*(dir: string) =
try:
aboutHtml = readFile(dir/"md/about.html")
except IOError:
stderr.write (dir/"md/about.html") & " not found, please run `nimble md`\n"
aboutHtml = "<h1>About page is missing</h1><br><br>"
proc renderAbout*(): VNode =
buildHtml(tdiv(class="overlay-panel")):
verbatim aboutHtml
h2: text "Instance info"
p:
text "Version "
a(href=link): text version


@ -6,9 +6,3 @@ proc renderFeature*(): VNode =
h1: text "Unsupported feature"
p:
text "Nitter doesn't support this feature yet, but it might in the future. "
text "You can check for an issue and open one if needed here: "
a(href="https://github.com/zedeus/nitter/issues"):
text "https://github.com/zedeus/nitter/issues"
p:
text "To find out more about the Nitter project, see the "
a(href="/about"): text "About page"


@ -31,9 +31,7 @@ proc renderNavbar(cfg: Config; req: Request; rss, canonical: string): VNode =
icon "search", title="Search", href="/search"
if cfg.enableRss and rss.len > 0:
icon "rss-feed", title="RSS Feed", href=rss
icon "bird", title="Open in Twitter", href=canonical
a(href="https://liberapay.com/zedeus"): verbatim lp
icon "info", title="About", href="/about"
icon "cog", title="Preferences", href=("/settings?referer=" & encodeUrl(path))
proc renderHead*(prefs: Prefs; cfg: Config; req: Request; titleText=""; desc="";
@ -42,7 +40,7 @@ proc renderHead*(prefs: Prefs; cfg: Config; req: Request; titleText=""; desc="";
var theme = prefs.theme.toTheme
if "theme" in req.params:
theme = req.params["theme"].toTheme
let ogType =
if video.len > 0: "video"
elif rss.len > 0: "object"
@ -73,7 +71,7 @@ proc renderHead*(prefs: Prefs; cfg: Config; req: Request; titleText=""; desc="";
link(rel="alternate", type="application/rss+xml", href=rss, title="RSS feed")
if prefs.hlsPlayback:
script(src="/js/hls.light.min.js", `defer`="")
script(src="/js/hls.min.js", `defer`="")
script(src="/js/hlsPlayback.js", `defer`="")
if prefs.infiniteScroll:


@ -58,14 +58,10 @@ proc renderUserCard*(user: User; prefs: Prefs): VNode =
tdiv(class="profile-card-extra-links"):
ul(class="profile-statlist"):
a(href="/" & user.username):
renderStat(user.tweets, "posts", text="Tweets")
a(href="/" & user.username & "/following"):
renderStat(user.following, "following")
a(href="/" & user.username & "/followers"):
renderStat(user.followers, "followers")
a(href="/" & user.username & "/favorites"):
renderStat(user.likes, "likes")
renderStat(user.tweets, "posts", text="Tweets")
renderStat(user.following, "following")
renderStat(user.followers, "followers")
renderStat(user.likes, "likes")
proc renderPhotoRail(profile: Profile): VNode =
let count = insertSep($profile.user.media, ',')
@ -103,7 +99,7 @@ proc renderProtected(username: string): VNode =
h2: text "This account's tweets are protected."
p: text &"Only confirmed followers have access to @{username}'s tweets."
proc renderProfile*(profile: var Profile; cfg: Config; prefs: Prefs; path: string): VNode =
proc renderProfile*(profile: var Profile; prefs: Prefs; path: string): VNode =
profile.tweets.query.fromUser = @[profile.user.username]
buildHtml(tdiv(class="profile-tabs")):


@ -91,7 +91,7 @@ proc genDate*(pref, state: string): VNode =
proc genImg*(url: string; class=""): VNode =
buildHtml():
img(src=getPicUrl(url), class=class, alt="")
img(src=getPicUrl(url), class=class, alt="", loading="lazy")
proc getTabClass*(query: Query; tab: QueryKind): string =
if query.kind == tab: "tab-item active"


@ -3,7 +3,7 @@ import strutils, strformat, sequtils, unicode, tables, options
import karax/[karaxdsl, vdom]
import renderutils, timeline
import ".."/[types, query, config]
import ".."/[types, query]
const toggles = {
"nativeretweets": "Retweets",
@ -24,12 +24,12 @@ proc renderSearch*(): VNode =
buildHtml(tdiv(class="panel-container")):
tdiv(class="search-bar"):
form(`method`="get", action="/search", autocomplete="off"):
hiddenField("f", "users")
hiddenField("f", "tweets")
input(`type`="text", name="q", autofocus="",
placeholder="Enter username...", dir="auto")
placeholder="Search...", dir="auto")
button(`type`="submit"): icon "search"
proc renderProfileTabs*(query: Query; username: string; cfg: Config): VNode =
proc renderProfileTabs*(query: Query; username: string): VNode =
let link = "/" & username
buildHtml(ul(class="tab")):
li(class=query.getTabClass(posts)):
@ -38,8 +38,6 @@ proc renderProfileTabs*(query: Query; username: string; cfg: Config): VNode =
a(href=(link & "/with_replies")): text "Tweets & Replies"
li(class=query.getTabClass(media)):
a(href=(link & "/media")): text "Media"
li(class=query.getTabClass(favorites)):
a(href=(link & "/favorites")): text "Likes"
li(class=query.getTabClass(tweets)):
a(href=(link & "/search")): text "Search"
@ -99,7 +97,7 @@ proc renderTweetSearch*(results: Timeline; prefs: Prefs; path: string;
text query.fromUser.join(" | ")
if query.fromUser.len > 0:
renderProfileTabs(query, query.fromUser.join(","), cfg)
renderProfileTabs(query, query.fromUser.join(","))
if query.fromUser.len == 0 or query.kind == tweets:
tdiv(class="timeline-header"):
@ -120,8 +118,3 @@ proc renderUserSearch*(results: Result[User]; prefs: Prefs): VNode =
renderSearchTabs(results.query)
renderTimelineUsers(results, prefs)
proc renderUserList*(results: Result[User]; prefs: Prefs): VNode =
buildHtml(tdiv(class="timeline-container")):
tdiv(class="timeline-header")
renderTimelineUsers(results, prefs)


@ -10,9 +10,7 @@ import general
const doctype = "<!DOCTYPE html>\n"
proc renderMiniAvatar(user: User; prefs: Prefs): VNode =
let url = getPicUrl(user.getUserPic("_mini"))
buildHtml():
img(class=(prefs.getAvatarClass & " mini"), src=url)
genImg(user.getUserPic("_mini"), class=(prefs.getAvatarClass & " mini"))
proc renderHeader(tweet: Tweet; retweet: string; pinned: bool; prefs: Prefs): VNode =
buildHtml(tdiv):
@ -92,10 +90,10 @@ proc renderVideo*(video: Video; prefs: Prefs; path: string): VNode =
tdiv(class="attachment video-container"):
let thumb = getSmallPic(video.thumb)
if not video.available:
img(src=thumb)
img(src=thumb, loading="lazy")
renderVideoUnavailable(video)
elif not prefs.isPlaybackEnabled(playbackType):
img(src=thumb)
img(src=thumb, loading="lazy")
renderVideoDisabled(playbackType, path)
else:
let
@ -144,7 +142,7 @@ proc renderPoll(poll: Poll): VNode =
proc renderCardImage(card: Card): VNode =
buildHtml(tdiv(class="card-image-container")):
tdiv(class="card-image"):
img(src=getPicUrl(card.image), alt="")
genImg(card.image)
if card.kind == player:
tdiv(class="card-overlay"):
tdiv(class="overlay-circle"):
@ -180,19 +178,14 @@ func formatStat(stat: int): string =
if stat > 0: insertSep($stat, ',')
else: ""
proc renderStats(stats: TweetStats; views: string; tweet: Tweet): VNode =
proc renderStats(stats: TweetStats; views: string): VNode =
buildHtml(tdiv(class="tweet-stats")):
a(href=getLink(tweet)):
span(class="tweet-stat"): icon "comment", formatStat(stats.replies)
a(href=getLink(tweet, false) & "/retweeters"):
span(class="tweet-stat"): icon "retweet", formatStat(stats.retweets)
a(href="/search?q=quoted_tweet_id:" & $tweet.id):
span(class="tweet-stat"): icon "quote", formatStat(stats.quotes)
a(href=getLink(tweet, false) & "/favoriters"):
span(class="tweet-stat"): icon "heart", formatStat(stats.likes)
a(href=getLink(tweet)):
if views.len > 0:
span(class="tweet-stat"): icon "play", insertSep(views, ',')
span(class="tweet-stat"): icon "comment", formatStat(stats.replies)
span(class="tweet-stat"): icon "retweet", formatStat(stats.retweets)
span(class="tweet-stat"): icon "quote", formatStat(stats.quotes)
span(class="tweet-stat"): icon "heart", formatStat(stats.likes)
if views.len > 0:
span(class="tweet-stat"): icon "play", insertSep(views, ',')
proc renderReply(tweet: Tweet): VNode =
buildHtml(tdiv(class="replying-to")):
@ -350,7 +343,7 @@ proc renderTweet*(tweet: Tweet; prefs: Prefs; path: string; class=""; index=0;
renderMediaTags(tweet.mediaTags)
if not prefs.hideTweetStats:
renderStats(tweet.stats, views, tweet)
renderStats(tweet.stats, views)
if showThread:
a(class="show-thread", href=("/i/status/" & $tweet.threadId)):


@ -26,8 +26,8 @@ no_thumb = [
'lnkd.in'],
['Thom_Wolf/status/1122466524860702729',
'facebookresearch/fairseq',
'Facebook AI Research Sequence-to-Sequence Toolkit written in Python. - GitHub - facebookresearch/fairseq: Facebook AI Research Sequence-to-Sequence Toolkit written in Python.',
'GitHub - NVIDIA/Megatron-LM: Ongoing research training transformer models at scale',
'Ongoing research training transformer models at scale - NVIDIA/Megatron-LM',
'github.com'],
['brent_p/status/1088857328680488961',


@ -4,7 +4,7 @@ from parameterized import parameterized
profiles = [
['mobile_test', 'Test account',
'Test Account. test test Testing username with @mobile_test_2 and a #hashtag',
'San Francisco, CA', 'example.com/foobar', 'Joined October 2009', '98'],
'San Francisco, CA', 'example.com/foobar', 'Joined October 2009', '97'],
['mobile_test_2', 'mobile test 2', '', '', '', 'Joined January 2011', '13']
]

tools/get_session.py Normal file

@ -0,0 +1,152 @@
#!/usr/bin/env python3
import requests
import json
import sys
import pyotp
# NOTE: pyotp and requests are dependencies
# > pip install pyotp requests
TW_CONSUMER_KEY = '3nVuSoBZnx6U4vzUxf5w'
TW_CONSUMER_SECRET = 'Bcs59EFbbsdF6Sl9Ng71smgStWEGwXXKSjYvPVt7qys'
def auth(username, password, otp_secret):
bearer_token_req = requests.post("https://api.twitter.com/oauth2/token",
auth=(TW_CONSUMER_KEY, TW_CONSUMER_SECRET),
headers={"Content-Type": "application/x-www-form-urlencoded"},
data='grant_type=client_credentials'
).json()
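# the token endpoint returns {"token_type": "bearer", "access_token": "..."};
# joining the values builds the "bearer <token>" Authorization header value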
bearer_token = ' '.join(str(x) for x in bearer_token_req.values())
guest_token = requests.post(
"https://api.twitter.com/1.1/guest/activate.json",
headers={'Authorization': bearer_token}
).json().get('guest_token')
if not guest_token:
print("Failed to obtain guest token.")
sys.exit(1)
twitter_header = {
'Authorization': bearer_token,
"Content-Type": "application/json",
"User-Agent": "TwitterAndroid/10.21.0-release.0 (310210000-r-0) ONEPLUS+A3010/9 (OnePlus;ONEPLUS+A3010;OnePlus;OnePlus3;0;;1;2016)",
"X-Twitter-API-Version": '5',
"X-Twitter-Client": "TwitterAndroid",
"X-Twitter-Client-Version": "10.21.0-release.0",
"OS-Version": "28",
"System-User-Agent": "Dalvik/2.1.0 (Linux; U; Android 9; ONEPLUS A3010 Build/PKQ1.181203.001)",
"X-Twitter-Active-User": "yes",
"X-Guest-Token": guest_token,
"X-Twitter-Client-DeviceID": ""
}
session = requests.Session()
session.headers = twitter_header
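# login is driven through the onboarding task.json flow: start the flow, then
# submit the username, the password, and (if prompted) a TOTP code as separate task requests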
task1 = session.post(
'https://api.twitter.com/1.1/onboarding/task.json',
params={
'flow_name': 'login',
'api_version': '1',
'known_device_token': '',
'sim_country_code': 'us'
},
json={
"flow_token": None,
"input_flow_data": {
"country_code": None,
"flow_context": {
"referrer_context": {
"referral_details": "utm_source=google-play&utm_medium=organic",
"referrer_url": ""
},
"start_location": {
"location": "deeplink"
}
},
"requested_variant": None,
"target_user_id": 0
}
}
)
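# the first flow response carries an 'att' header; copy it so the later task requests include it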
session.headers['att'] = task1.headers.get('att')
task2 = session.post(
'https://api.twitter.com/1.1/onboarding/task.json',
json={
"flow_token": task1.json().get('flow_token'),
"subtask_inputs": [{
"enter_text": {
"suggestion_id": None,
"text": username,
"link": "next_link"
},
"subtask_id": "LoginEnterUserIdentifier"
}]
}
)
task3 = session.post(
'https://api.twitter.com/1.1/onboarding/task.json',
json={
"flow_token": task2.json().get('flow_token'),
"subtask_inputs": [{
"enter_password": {
"password": password,
"link": "next_link"
},
"subtask_id": "LoginEnterPassword"
}],
}
)
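# an "open_account" subtask means login finished and carries the OAuth token pair;
# an "enter_text" subtask is the 2FA (TOTP) challenge handled below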
for t3_subtask in task3.json().get('subtasks', []):
if "open_account" in t3_subtask:
return t3_subtask["open_account"]
elif "enter_text" in t3_subtask:
response_text = t3_subtask["enter_text"]["hint_text"]
totp = pyotp.TOTP(otp_secret)
generated_code = totp.now()
task4resp = session.post(
"https://api.twitter.com/1.1/onboarding/task.json",
json={
"flow_token": task3.json().get("flow_token"),
"subtask_inputs": [
{
"enter_text": {
"suggestion_id": None,
"text": generated_code,
"link": "next_link",
},
"subtask_id": "LoginTwoFactorAuthChallenge",
}
],
}
)
task4 = task4resp.json()
for t4_subtask in task4.get("subtasks", []):
if "open_account" in t4_subtask:
return t4_subtask["open_account"]
return None
if __name__ == "__main__":
if len(sys.argv) != 4:
print("Usage: %s <username> <password> <2fa secret>" % sys.argv[0])
sys.exit(1)
username = sys.argv[1]
password = sys.argv[2]
otp_secret = sys.argv[3]
result = auth(username, password, otp_secret)
if result is None:
print("Authentication failed.")
sys.exit(1)
print(json.dumps({
"oauth_token": result.get("oauth_token"),
"oauth_token_secret": result.get("oauth_token_secret"),
}, indent=2))
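A minimal usage sketch, not part of the diff above: reusing auth() from tools/get_session.py in another script instead of the CLI. The sessions.jsonl path is only an assumption here; use whatever file your instance is actually configured to read sessions from.

# sketch: append a session entry produced by tools/get_session.py's auth()
import json
from get_session import auth  # assumes this runs next to tools/get_session.py

account = auth("myuser", "mypassword", "BASE32OTPSECRET")
if account is None:
    raise SystemExit("authentication failed")
with open("sessions.jsonl", "a") as f:  # filename is an assumption
    f.write(json.dumps({
        "oauth_token": account["oauth_token"],
        "oauth_token_secret": account["oauth_token_secret"],
    }) + "\n")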


@ -1,10 +0,0 @@
import std/[os, strutils]
import markdown
for file in walkFiles("public/md/*.md"):
let
html = markdown(readFile(file))
output = file.replace(".md", ".html")
output.writeFile(html)
echo "Rendered ", output

ups.json Normal file

@ -0,0 +1,5 @@
{
"upstream": "https://github.com/zedeus/nitter",
"provider": "github",
"commit": "e40c61a6ae76431c570951cc4925f38523b00a82"
}
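These fields appear to pin the fork's upstream for the ups workflow: "upstream" and "provider" name the repository it compares against, and "commit" records the upstream commit this tree is currently based on (an interpretation of the config; it is not documented in the diff itself).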