
gazelle's People

Contributors

asutekku, battleprogrammershirase, dependabot[bot], olet-toporkov, pjc09h, thisis-myname, tricidious, xoru

gazelle's Issues

Move all social features to a Discourse backend

Replace huge swaths of garbage homebrew code for nonessential features with a Discourse API backend running in a Docker container. Several steps to this migration:

  • set up the Discourse Connect SSO with automatic forum login (sketched after this list)
  • finish mocking up the forums, wiki, torrent comments, news/blog, user profile, and private message interfaces
  • support the full CRUD operations of the relevant Discourse features
  • lock all social stuff behind an authentication challenge ("you must be logged in to view the forums")
  • migrate the existing data to Discourse
  • proxy the Discourse API through the BioTorrents.de one
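
A minimal sketch of the DiscourseConnect SSO handshake, assuming a shared secret in the environment; the endpoint path, the forum hostname, and the way the current user is looked up are all stand-ins, and a real implementation would also validate the nonce and session state.

```php
<?php
// Hypothetical /sso endpoint: Discourse redirects here with ?sso=<payload>&sig=<hmac>.
// The secret must match the discourse_connect_secret configured in Discourse.
$secret = getenv('DISCOURSE_CONNECT_SECRET');
$sso = $_GET['sso'] ?? '';
$sig = $_GET['sig'] ?? '';

// Verify the signature before trusting the payload
if (!hash_equals(hash_hmac('sha256', $sso, $secret), $sig)) {
    http_response_code(403);
    exit('bad DiscourseConnect signature');
}

// The payload is a base64-encoded query string containing a nonce
parse_str(base64_decode($sso), $request);

// Stand-in for however the session exposes the already-authenticated Gazelle account
$user = ['id' => 1, 'email' => 'user@example.com', 'username' => 'example'];

// Build the return payload from the authenticated user
$payload = base64_encode(http_build_query([
    'nonce' => $request['nonce'],
    'external_id' => $user['id'],
    'email' => $user['email'],
    'username' => $user['username'],
]));

// Sign it and send the browser back to Discourse to complete the login
$query = http_build_query([
    'sso' => $payload,
    'sig' => hash_hmac('sha256', $payload, $secret),
]);
header("Location: https://forums.example.com/session/sso_login?{$query}");
```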

This will free up a lot of time to focus on the actual torrent features. There are some longstanding forum bugs, as well as huge SQLi potential, that I don't want to fix (e.g., locking and moving threads has never worked right).

This should be reasonably feature complete compared to what currently exists, except with a better frontend and overall cleaner backend logic.

Bearer token scopes

Scopes can be pretty simple, sliced by site section or HTTP method. Whatever works and is easiest.
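
A sketch of what section-plus-method scopes could look like; the scope naming scheme and the shape of the token payload are hypothetical.

```php
<?php
// Hypothetical scope check: a token carries scopes like "torrents:read" or "forums:write",
// and each request maps its site section plus HTTP method onto one required scope.
function checkScope(array $tokenScopes, string $section, string $method): bool
{
    $action = in_array($method, ['GET', 'HEAD'], true) ? 'read' : 'write';
    $required = "{$section}:{$action}";

    return in_array($required, $tokenScopes, true)
        || in_array("{$section}:*", $tokenScopes, true);
}

// e.g., a GET to the torrents section with a token scoped to ["torrents:read"] passes
var_dump(checkScope(['torrents:read'], 'torrents', 'GET'));    // bool(true)
var_dump(checkScope(['torrents:read'], 'torrents', 'DELETE')); // bool(false)
```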

Buy and test a U2F hardware key

I recently received a YubiKey 5C NFC essentially for free, so I can now develop and test the FIDO2 authentication feature. It will be implemented using the native browser WebAuthn specification, of course, and not rely on the couple of deprecated libraries that OT Gazelle used.
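
A minimal sketch of the server side issuing a registration challenge for the WebAuthn ceremony; the relying party ID is assumed, and the rest of the flow (attestation verification, credential storage) would be handled by a FIDO2 library and is not shown.

```php
<?php
// Hypothetical endpoint: hand the browser a PublicKeyCredentialCreationOptions object
// for navigator.credentials.create(). The challenge is stored server-side and checked
// later against the clientDataJSON that the authenticator signs. The browser decodes
// the base64url fields back into ArrayBuffers before calling the WebAuthn API.
session_start();

$challenge = random_bytes(32);
$_SESSION['webauthnChallenge'] = $challenge;

// base64url without padding
$encode = fn (string $bytes): string => rtrim(strtr(base64_encode($bytes), '+/', '-_'), '=');

header('Content-Type: application/json');
echo json_encode([
    'challenge' => $encode($challenge),
    'rp' => ['id' => 'biotorrents.de', 'name' => 'BioTorrents.de'], // assumed RP ID
    'user' => [
        'id' => $encode(random_bytes(16)), // stand-in for the real user handle
        'name' => 'example',
        'displayName' => 'example',
    ],
    'pubKeyCredParams' => [
        ['type' => 'public-key', 'alg' => -7],   // ES256
        ['type' => 'public-key', 'alg' => -257], // RS256
    ],
    'authenticatorSelection' => ['userVerification' => 'preferred'],
    'timeout' => 60000,
]);
```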

All integer primary keys should be a bigint and most database tables should have a UUID v7 unique key

Branching off the work in creatorObjects to position the database for scale. I've been meaning to implement some kind of basic sharding and replication since the beginning, which relies on not having key collisions. UUID v7 stored as binary(16) as a unique key, while maintaining the standard auto-increment id bigint columns, seems to be the way to go.

The database class is already set up to transparently handle UUID binary-to-string conversion, so, e.g., select uuid, name from creators order by created desc limit 10 will return UUIDs in the form of 01877b4a-b27c-70db-9522-149e9a40ef59.
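
For reference, a sketch of the round trip with ramsey/uuid (the library the docs below describe), assuming the uuid column is binary(16); the commented write goes through the site's own wrapper, whose do() method and parameter style are described in a later issue.

```php
<?php

require __DIR__ . '/vendor/autoload.php';

// Generate a time-ordered UUID v7 and keep the 16 raw bytes for a binary(16) column
$uuid = \Ramsey\Uuid\Uuid::uuid7();
$binary = $uuid->getBytes();

// Hypothetical write through the database wrapper:
// $db->do('insert into creators (uuid, name, created) values (?, ?, now())', [$binary, 'Example Lab']);

// Converting back for display or API output
echo \Ramsey\Uuid\Uuid::fromBytes($binary)->toString();
// e.g., 01877b4a-b27c-70db-9522-149e9a40ef59
```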

UUID documentation:
https://uuid.ramsey.dev/en/stable/rfc4122/version7.html
https://uuid.ramsey.dev/en/stable/database.html

Sharding documentation:
https://aws.amazon.com/what-is/database-sharding/
https://www.linode.com/docs/guides/sharded-database/

Misc documentation:
https://emmer.dev/blog/why-you-should-use-uuids-for-your-primary-keys/
https://itnext.io/laravel-the-mysterious-ordered-uuid-29e7500b4f8
https://stackoverflow.com/questions/52414414/best-practices-on-primary-key-auto-increment-and-uuid-in-sql-databases
https://tomharrisonjr.com/uuid-or-guid-as-primary-keys-be-careful-7b2aa3dcb439
https://vladmihalcea.com/uuid-database-primary-key/
https://www.mysqltutorial.org/mysql-uuid/
https://www.percona.com/blog/store-uuid-optimized-way/

Rethink bonus points and user classes to act more like a hostile bank account

The way user classes and bonus points currently work: you rank up through upload activity (or by buying upload with BP), and you get a large amount of BP for your seed size. This should be reversed: user ranks should depend on a minimum average seed size, and BP should compound negatively, i.e., decay over time. I know "hostile bank account" is a tautology.
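
Purely illustrative of the "negatively compounded" idea; the decay rate and period are made up and would need tuning against real seeding behavior.

```php
<?php
// Illustrative only: BP shrinks by a fixed percentage each period unless topped up,
// like a bank account that charges fees. A 2% weekly decay is an arbitrary example.
function decayBonusPoints(float $balance, float $weeklyRate = 0.02, int $weeks = 1): float
{
    return $balance * (1 - $weeklyRate) ** $weeks;
}

echo decayBonusPoints(10000.0, 0.02, 4); // ~9223.68 after four weeks of inactivity
```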

Soft deletes for torrents

Would be pretty useful: DMCA request comes in, we soft delete it, turns out the request is abusive, nothing is lost.
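
A sketch of the soft-delete path, assuming a nullable deleted_at column on the torrents table; the wrapper's do() method for writes follows the replication issue below, and the binding style is assumed.

```php
<?php
// Soft delete: mark the row instead of removing it, so an abusive DMCA claim
// can be reversed by clearing deleted_at.
function softDeleteTorrent($db, int $torrentId): void
{
    $db->do('update torrents set deleted_at = now() where id = ?', [$torrentId]);
}

function restoreTorrent($db, int $torrentId): void
{
    $db->do('update torrents set deleted_at = null where id = ?', [$torrentId]);
}

// Normal reads simply exclude soft-deleted rows:
// select * from torrents where id = ? and deleted_at is null
```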

Turn authors (creators) into first-class objects

Currently, the artist tables in the database are all linking tables. Torrent creators / study authors / artists / etc. should be their own object in the logical schema, similar to a torrent group, one that can be independently indexed and searched.
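
A sketch of what that could look like, reusing the creators table from the UUID issue above; the linking and group table names are assumptions, and the wrapper is assumed to handle the binary UUID conversion for bound parameters too.

```php
<?php
// Once creators are first-class rows, they can be fetched and searched on their own,
// then joined out to torrent groups through a linking table.
function creatorWithGroups($db, string $creatorUuid): array
{
    $creator = $db->row(
        'select uuid, name, created from creators where uuid = ?',
        [$creatorUuid]
    );

    $groups = $db->multi(
        'select tg.id, tg.title
           from torrents_group tg
           join torrents_groups_creators tgc on tgc.group_id = tg.id
           join creators c on c.id = tgc.creator_id
          where c.uuid = ?',
        [$creatorUuid]
    );

    return ['creator' => $creator, 'groups' => $groups];
}
```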

Move everything over to clean routes

I'm done with /foo.php?bar=baz in the web interface and API. Flight support is coming along. The best part is, it breaks all the leetcode and enforces strict standards.
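
A sketch of clean routes with Flight, which the issue says is already coming along; the route paths and the JSON bodies are hypothetical stand-ins for real controller calls.

```php
<?php

require __DIR__ . '/vendor/autoload.php';

// /torrents/123 instead of /torrents.php?id=123
Flight::route('GET /torrents/@id:[0-9]+', function (string $id) {
    // Hypothetical controller call; non-numeric ids never match the route
    Flight::json(['id' => (int) $id]);
});

// The API can share the same router under a prefix
Flight::route('GET /api/torrents/@id:[0-9]+', function (string $id) {
    Flight::json(['id' => (int) $id]);
});

Flight::start();
```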

Refactor the JavaScript: IIFE's and event listeners

Currently, a lot of the site JavaScript consists of raw functions dumped into a file. This causes problems with Google Closure Compiler, which, in its advanced optimization mode, aggressively rewrites function names; as a result, I can't use anything beyond simple optimizations. The solution is to encapsulate all JavaScript in self-executing arrow functions, (() => { /* ... */ })();, whose contents listen for events such as clicks on a widget.

plz be my ai gf

https://github.com/biotorrents/gazelle/blob/openai/app/OpenAI.php

OpenAI API integration for tl;dr torrent group summaries and keywords. Need to get as much production database coverage as possible before my free trial credits expire in April 2023 or so. This is largely done and will be merged into the authentication branch soon.
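
Not the repo's actual app/OpenAI.php, just a sketch of the chat completions call that a tl;dr summary could be built on, assuming an API key in the environment; the prompt and model choice are placeholders.

```php
<?php
// Sketch: summarize a torrent group description into a tl;dr plus keywords.
function tldr(string $description): array
{
    $ch = curl_init('https://api.openai.com/v1/chat/completions');
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POST => true,
        CURLOPT_HTTPHEADER => [
            'Content-Type: application/json',
            'Authorization: Bearer ' . getenv('OPENAI_API_KEY'),
        ],
        CURLOPT_POSTFIELDS => json_encode([
            'model' => 'gpt-3.5-turbo',
            'messages' => [
                ['role' => 'system', 'content' => 'Summarize in two sentences, then list five keywords.'],
                ['role' => 'user', 'content' => $description],
            ],
        ]),
    ]);
    $response = json_decode(curl_exec($ch), true);
    curl_close($ch);

    return $response['choices'][0]['message'] ?? [];
}
```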

Pardon the delay! It turns out that rewriting the authentication, the templates, the database layer, and a lot of other stuff became essentially a full application rewrite. Once everything is tested, I'll just merge it, even if it means the forums and wiki might go away for a while.

Login is broken on dev

Can't log into the dev instance (whoops). Good time to just rewrite the crazy system to use a secure library with sensible paths.

Full API CRUD support

Currently, the API only supports GET requests. It should use controllers for all the major objects of the site with simple methods like create(), read(), update(), and delete().
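
A sketch of the controller shape; the class name, the column names, and how the router maps HTTP verbs onto these methods are all assumptions.

```php
<?php
// Hypothetical controller: one class per major object, with the four verbs mapped
// onto POST, GET, PUT/PATCH, and DELETE by the router.
class TorrentController
{
    public function __construct(private $db) {}

    public function create(array $input): array
    {
        $this->db->do(
            'insert into torrents (group_id, info_hash, created) values (?, ?, now())',
            [$input['groupId'], $input['infoHash']]
        );

        return ['status' => 'created'];
    }

    public function read(int $id): ?array
    {
        return $this->db->row(
            'select * from torrents where id = ? and deleted_at is null',
            [$id]
        );
    }

    public function update(int $id, array $input): array
    {
        $this->db->do(
            'update torrents set description = ? where id = ?',
            [$input['description'], $id]
        );

        return ['status' => 'updated'];
    }

    public function delete(int $id): array
    {
        // Soft delete, per the issue above
        $this->db->do('update torrents set deleted_at = now() where id = ?', [$id]);

        return ['status' => 'deleted'];
    }
}
```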

Implement database replication in the Gazelle codebase

The new database class should transparently pull data from a replica if the methods single, row, column, or multi are called, and write data to the source if do is called. Both scenarios should support an array of database instances, but realistically, there's only one of each and that's way overkill.
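
A sketch of the read/write split, keeping the method names from this issue; the connection bootstrap, the parameter binding style, and the random replica choice are assumptions.

```php
<?php
// Sketch: reads go to a randomly chosen replica, writes go to the source.
class Database
{
    /** @param \PDO[] $replicas */
    public function __construct(private \PDO $source, private array $replicas = []) {}

    private function reader(): \PDO
    {
        // Fall back to the source if no replicas are configured
        return $this->replicas ? $this->replicas[array_rand($this->replicas)] : $this->source;
    }

    public function row(string $query, array $params = []): ?array
    {
        $statement = $this->reader()->prepare($query);
        $statement->execute($params);

        return $statement->fetch(\PDO::FETCH_ASSOC) ?: null;
    }

    public function do(string $query, array $params = []): int
    {
        $statement = $this->source->prepare($query);
        $statement->execute($params);

        return $statement->rowCount();
    }

    // single(), column(), and multi() would route through reader() the same way
}
```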

Rewrite the torrent search backend

The frontend is mostly done (screenshot attached below). The backend has always been a mess. I don't like messy logic in the core of the app (the whole point is to efficiently index and serve data), so I'm gonna rewrite the Sphinx backend with something a bit simpler and cleaner. I've had a look at the library's source code, and it seems okay. With time, we can probably index collections, requests, and maybe even Top 10 history.

[Screenshot: newTorrentSearch]
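
One leaner option is speaking SphinxQL directly over the MySQL protocol; a sketch of that, where the index name, its attributes, and the port are the usual Sphinx defaults but still assumptions about this deployment.

```php
<?php
// SphinxQL is just SQL over the MySQL wire protocol, so PDO can talk to searchd directly.
// 9306 is the default SphinxQL listener.
$sphinx = new \PDO('mysql:host=127.0.0.1;port=9306', '', '', [
    \PDO::ATTR_ERRMODE => \PDO::ERRMODE_EXCEPTION,
    \PDO::ATTR_EMULATE_PREPARES => true, // searchd has no server-side prepares
]);

$statement = $sphinx->prepare(
    'select id from torrents_index where match(:query) order by created desc limit 50'
);
$statement->execute(['query' => 'metagenomics']);

$torrentIds = $statement->fetchAll(\PDO::FETCH_COLUMN);
// ...then hydrate the full rows from MySQL by id
```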

Namespace the damn app already

The first class-name collision occurred with OpenAI. I'll start with the purely static classes (most of them) and work my way toward the rest. PSR-4 support has been a thing in composer.json for a while, probably deleted at some point because JSON doesn't support comments. No use statements, because I like to know what's going on.
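
A sketch of what the namespaced layout could look like under PSR-4, with fully qualified calls instead of use statements as described; the Gazelle namespace name and the composer.json mapping are assumptions.

```php
<?php
// composer.json would carry the assumed PSR-4 mapping:
//   "autoload": { "psr-4": { "Gazelle\\": "app/" } }

// app/OpenAI.php
namespace Gazelle;

class OpenAI
{
    public static function summarize(string $text): string
    {
        // ...call the API and return the tl;dr
        return $text;
    }
}

// Call sites stay fully qualified, no use statements, so there's never any ambiguity
// about which OpenAI class is meant when a vendor package ships one with the same name:
$summary = \Gazelle\OpenAI::summarize('Long torrent group description...');
```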

Use models for at least some core objects

A set of basic models that implement a subset of Laravel Eloquent's features, e.g., find(), save(), delete() (soft), etc., would go a long way toward cleaning up the code. Each major artifact, such as a torrent, group, collection, or request, should be its own "thing" that can be loaded, displayed, and manipulated. This also has implications for API CRUD support: load the model, change some stuff, and save it.
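
A sketch of a minimal base model along those lines, with find(), save(), and a soft delete(); the table and column conventions, the db wrapper, and the single-column save() are all assumptions kept deliberately small.

```php
<?php
// Minimal Eloquent-flavored base model: enough to load, change, and persist a row.
abstract class Model
{
    protected static string $table;

    public function __construct(protected $db, protected array $attributes = []) {}

    public static function find($db, int $id): ?static
    {
        $row = $db->row(
            sprintf('select * from %s where id = ? and deleted_at is null', static::$table),
            [$id]
        );

        return $row ? new static($db, $row) : null;
    }

    public function save(): void
    {
        // Sketch only: a real version would diff attributes and build the column list
        $this->db->do(
            sprintf('update %s set title = ? where id = ?', static::$table),
            [$this->attributes['title'], $this->attributes['id']]
        );
    }

    public function delete(): void
    {
        // Soft delete only
        $this->db->do(
            sprintf('update %s set deleted_at = now() where id = ?', static::$table),
            [$this->attributes['id']]
        );
    }
}

class Torrent extends Model
{
    protected static string $table = 'torrents';
}

// Usage: load the model, change some stuff, save it
// $torrent = Torrent::find($db, 123);
// $torrent->save();
```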
