Host your Database for Free on GitHub Pages

Databases are an essential part of many modern web applications, but running them can be really expensive. That's why in this video I'll show you how to run an SQLite database completely for free, on top of GitHub Pages. GitHub Pages only lets you host static files and websites, but with the JavaScript library SQL.js-HTTPVFS it is possible to query an SQLite database hosted entirely as static files.
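Here is a minimal sketch of what that looks like in the browser, based on the sql.js-httpvfs README. The database path /data.sqlite3 and the events table are assumptions for illustration, and a bundler like webpack 5 is assumed for the import.meta.url asset URLs:

```js
import { createDbWorker } from "sql.js-httpvfs";

// The library ships a web worker and a WASM build of SQLite; both are
// served as static assets next to your page.
const workerUrl = new URL("sql.js-httpvfs/dist/sqlite.worker.js", import.meta.url);
const wasmUrl = new URL("sql.js-httpvfs/dist/sql-wasm.wasm", import.meta.url);

const worker = await createDbWorker(
  [
    {
      from: "inline",
      config: {
        serverMode: "full",     // the database is one plain file
        url: "/data.sqlite3",   // assumed path on your GitHub Pages site
        requestChunkSize: 4096, // fetch the file in 4 KiB range requests
      },
    },
  ],
  workerUrl.toString(),
  wasmUrl.toString()
);

// Ordinary SQL from here on; only the pages needed to answer the query
// are actually downloaded.
const rows = await worker.db.query("SELECT * FROM events LIMIT 20");
console.log(rows);
```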
===
Check out my project "solidtime - The modern Open-Source Time Tracker" on www.solidtime.io
===
The final result of the "On this day" website can be found here: onthisday.bufferhead.com/
The GitHub repository with all the source code: github.com/bufferhead-code/on...
===
Check out SQL.js-HTTPVFS at:
github.com/phiresky/sql.js-ht...
If you want to find out more about it, the author explained the initial implementation here:
phiresky.github.io/blog/2021/...
You can find the dataset used for the "On this day" website here:
www.kaggle.com/datasets/guill...
===
Regular databases like MySQL, Postgres, or Redis/Valkey run as a service and need server-side computing power to work. SQLite is quite different: the whole database lives in a single file, which makes it really useful for applications that need a performant but simple database, and you can query it with SQL just like any other database. The key difference is that because SQLite only needs a filesystem to run, we can distribute it with static page hosting services like GitHub Pages. It is also possible to use CDNs like Cloudflare or BunnyCDN to distribute the database very cheaply. We use the HTTP protocol to our advantage and set an HTTP Range header to request only the parts of the SQLite database we actually need in our application.
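To see the Range trick in isolation, here is a small standalone fetch that asks the server for just the first 4 KiB of the database file. The URL is a placeholder; the host has to answer with 206 Partial Content, which GitHub Pages does:

```js
// Placeholder URL; any static host that honors Range requests works.
const url = "https://example.github.io/data.sqlite3";

// Ask for the first database page only (bytes 0-4095).
const res = await fetch(url, { headers: { Range: "bytes=0-4095" } });
console.log(res.status); // 206 Partial Content when the range is honored

const chunk = new Uint8Array(await res.arrayBuffer());

// Every SQLite file begins with the 16-byte magic string "SQLite format 3\0",
// so checking the first bytes is a quick sanity test:
console.log(new TextDecoder().decode(chunk.slice(0, 15))); // "SQLite format 3"
```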

Comments: 95

  • @aroyanbs

    Laravel 11 uses SQLite by default now. CMIIW

  • @ea02ca6f

    I got excited cuz I thought you figured out a way to do writes, not just reads.

  • @abraham_o

    Just because you can doesn't mean you should.

  • @alifdarsim7767

    This is a very bad idea to follow. Not only can't you do writes, but the read speed will be very slow compared to the normal approach

  • @kirarevcrow

    This is misleading, clickbait

  • @bobkelso5681

    If you cannot do writes, it is not a database at all.

  • @Nodsaibot

    pffft just have a csv file

  • @geekofia

    Don't abuse those free platforms

  • @raiyansarker

    just use turso

  • @filipegoncalves3739

    Before watching you should know that it's only a read-only """database""", you can't really write to it, so it's basically just an API and not a database like the video and title say

  • @TheHTMLCode

    I’m so glad that I watched until the end, I was like “how the hell does it know what range of bytes to load if the data isn’t clustered”. Nice project and explanation, thanks :)

  • @3RR0RNULL

    Nah I'm convinced YouTube can read my thoughts. I was just thinking about how I could host a database for free yesterday, and lo and behold… this video the next day.

  • @littleharry7977

    Always love it when you upload!

  • @thethiny

    If your database is static and can be pre-cached (thanks to your byte-range feature) then it shouldn't have been a database to begin with; multiple JSONs would've been the better option.

  • @goatslayer5957

    Really cool, always super interested in nodb approaches!! Thanks for sharing!

  • @UTJK.

    Thanks for sharing! I always wondered if a configuration like this was feasible.

  • @Vorono4ka

    Site looks awesome, I like this design

  • @Dogo.R

    Why not contribute to activitywatcher?

  • @BohAiCov

    I can't find a tutorial on how to actually do this, I'm looking for suggestions and help

  • @darkmift

    Try splitting your data into per-decade databases and query based on that.
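A sketch of that last suggestion, reusing the same library. The per-decade file layout and the dbUrlForYear helper are hypothetical: each decade gets its own static SQLite file, and the page only opens the one it needs.

```js
import { createDbWorker } from "sql.js-httpvfs";

const workerUrl = new URL("sql.js-httpvfs/dist/sqlite.worker.js", import.meta.url);
const wasmUrl = new URL("sql.js-httpvfs/dist/sql-wasm.wasm", import.meta.url);

// Hypothetical layout: one database file per decade,
// e.g. /data/events-1990.sqlite3 covers 1990-1999.
function dbUrlForYear(year) {
  const decade = Math.floor(year / 10) * 10;
  return `/data/events-${decade}.sqlite3`;
}

// Open only the decade the user asked about and run a query against it.
async function queryDecade(year, sql) {
  const worker = await createDbWorker(
    [
      {
        from: "inline",
        config: {
          serverMode: "full",
          url: dbUrlForYear(year),
          requestChunkSize: 4096,
        },
      },
    ],
    workerUrl.toString(),
    wasmUrl.toString()
  );
  return worker.db.query(sql);
}
```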