Drive Strm

The trials and tribulations of getting drive_strm to work

Most of this is built from my best guesses from reading the source code and asking around on Discourse and the like, as there are no docs (as yet).

strm files are only usable with Emby. Plex removed support for them. Boo.

drive_strm is a Python app that does a couple of things.

Firstly, it can generate strm files from the video files stored in your Google Drive.
It can include the various versions that Google creates from your originals (360p, 480p, 720p, 1080p).

The strm files are simply text files that contain a link. The link contains the name that Google uses to store the file, **it is not the name of your file, but an unreadable collection of letters and numbers**, and the URL points to the machine that is running the drive_strm script, not directly to Google. This is where the second part comes in.
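As a concrete illustration, a strm file might look something like this. The host and port would come from the access_url config value, and the trailing ID stands in for an opaque Drive file ID; both are made-up placeholders, not drive_strm's actual URL scheme:

```shell
# Write and inspect a hypothetical strm file. The file ID "1AbC_dEfGhIjK" and
# the URL path are placeholders, not the real drive_strm URL format.
mkdir -p /tmp/strm_example
echo "http://your-server:7249/strm/1AbC_dEfGhIjK" > /tmp/strm_example/MyMovie.strm
cat /tmp/strm_example/MyMovie.strm
```

The player never sees your Drive credentials; it just follows that URL to the proxy.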

Secondly, and quite importantly, it also acts as a proxy to serve up these files.

As I understand it, because the files on your Drive are not publicly shared, drive_strm needs to authenticate against Google in order to fetch them. (Please feel free to correct me if I am wrong.)
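If that's right, the proxy's job presumably boils down to something like the following curl sketch against the Drive v3 files endpoint with alt=media. This is my guess at what happens under the hood, not drive_strm's actual code, and the file ID and token are fake placeholders:

```shell
# Sketch: fetching a private Drive file with an OAuth bearer token via the
# Drive v3 files.get endpoint. FILE_ID and ACCESS_TOKEN are placeholders.
FILE_ID="1AbC_dEfGhIjK"
ACCESS_TOKEN="ya29.placeholder"
URL="https://www.googleapis.com/drive/v3/files/${FILE_ID}?alt=media"
# The actual fetch is commented out, since the credentials above are fake:
# curl -s -H "Authorization: Bearer ${ACCESS_TOKEN}" "$URL" -o movie.mp4
echo "$URL"
```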

When your media player (Emby, Kodi, VLC) opens the strm file, the request is sent to the machine running drive_strm, which authenticates and then serves the file.

UPDATE: It seems that if you try to play the strm file for the original, some players/clients will play it directly from Google, so they do not use bandwidth on the machine that drive_strm is running on.

I have only tested with VLC so far. But… playing a transcoded version pulls the file through the proxy and therefore does use your server's bandwidth!

It seems that player support for direct play from Google is pretty poor.

Player results:

These are the results I got when testing various players

VLC: Direct Play From Google for the Original, Transcodes through the proxy
Kodi: Direct Play From Google for the Original, Transcodes through the proxy
Desktop Player: Direct Play From Google for the Original, Transcodes through the proxy
iOS: All playback requests go through the proxy
Android Clients: TBC

Now, I don’t think it matters where the proxy is running, so my latest idea is to run the proxy somewhere with unlimited bandwidth, separate from the Emby server, as my Emby server machine does have a bandwidth limit. I am just generating some strm files now, so I will see if that works and report back.

My initial aim was to get it running on my Synology NAS.

But the NAS does not have a new enough version of Python available, so running it there will require the Docker version. So I initially set it up on my Mac.

Getting it to work

Basics (standalone)

Get Google Drive API Credentials:

The usual process, documented already in many places; if you don’t know how, Google it.

Install Python 3.7, or upgrade your Python to 3.7, and install pip for Python 3.7.

On my Mac I use MacPorts, so I did it that way.

On an Ubuntu box, to install it into your home directory, you could try this (it should work if you have the tools for building software installed):

mkdir -p ~/python/python37
wget -qO ~/python.tar.xz https://www.python.org/ftp/python/3.7.4/Python-3.7.4.tar.xz
cd ~ && tar xf ~/python.tar.xz && cd ~/Python-3.7.4
./configure --prefix=$HOME/python/python37 && make && make install

Download the drive_strm code from GitHub and install all the Python gubbins you need from the requirements file. For the example below, the drive_strm folder is in the root of your home directory:

~/python/python37/bin/pip3 install --no-cache-dir --upgrade -r ~/drive_strm/requirements.txt

This should install all the required python modules.

Edit the config file; see the sample file provided.

This is an example of mine with real data removed. I do not have any team drives, so if you have some, check the example config file provided.

  "google": {
    "allowed": {
        "file_paths": [
          "My Drive/path/to/files/to/create/strm/files/from/"
        "file_extensions": false,
        "mime_types": true,
        "mime_types_list": ["video"]
    "client_id": "add_your_client_id",
    "client_secret": "add_your_client_secret",
    "maindrive": true,
    "teamdrive": false,
    "teamdrives": [],
    "poll_interval": 120
  "server": {
    "listen_ip": "",
    "listen_port": 7249,
    "direct_streams": false
  "strm": {
    "access_url": "http://valid_url_or_ipaddress:7249/",
    "root_path": "/path_where_strm_files_stored/strm",
    "remove_empty_dirs": true,
    "empty_dir_depth": 4,
    "show_transcodes": true,
    "chunk_size": 250000

Run the script with the authorize flag. (I have used full paths to make sure it runs the correct version of Python, as you may have more than one version installed.)

~/python/python37/bin/python3 ~/drive_strm/ authorize

You should get a couple of lines of text including a link, like:

| INFO     | __main__:authorize:121 - Visit the link below and paste the authorization code
| INFO     | __main__:authorize:122 -

Copy the link into a browser and sign in to your Google account. After signing in, you should be presented with a token. Copy this token and paste it where the script says "Enter authorization code:".

This will create a vault.db file and store the authentication details in it.

NOTE: It would be prudent to make a copy of the vault.db file at this point, as I needed it when running the Docker version. You can copy this file to another install and you will not need to go through the auth process again.
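For example, something like this (paths assume the drive_strm folder is in your home directory, as above; adjust to your setup):

```shell
# Back up the auth database so a later install (e.g. the Docker version) can
# reuse it and skip the auth flow. The touch is just a stand-in so this
# example runs anywhere; on a real install vault.db already exists.
mkdir -p ~/drive_strm
touch ~/drive_strm/vault.db
cp ~/drive_strm/vault.db ~/drive_strm/vault.db.backup
```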

So you should be all set to start the script, at which point it will start scanning your drive, looking for the directories you specified in the config file.

Run the script with the run command:

~/python/python37/bin/python3 ~/drive_strm/ run

and with any luck it will start generating the strm files for you.
This might take a while depending on the size of your drive. (I am talking probably hours.)

Once done, you can create a library in Emby and point it at these files; if there are lots, wait a while for Emby to process them.

You can, and may eventually want to, run it in Docker, but I find it quicker and easier to start without Docker, as the auth is a bit tricky with Docker.

Although, if you have the copy of your vault.db that you took earlier, it is not so tricky. I think the issue is that the Docker image is designed to work with the Cloudbox system, which is probably responsible for adding the auth details into the vault.db file. I may need to look at the code from an earlier version, as I think you used to pass the token to Docker, and see if I can create a modified version that still does that while keeping the newer changes.
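For what it's worth, this is the sort of invocation I would expect to work once the vault.db is in place. The image name and the container-side paths are my guesses (check the repo's Dockerfile for the real ones), so the snippet just prints the command rather than running it:

```shell
# Assumed image name and mount points; verify before use. The key idea is
# mounting the vault.db copied earlier so the container can skip the auth step.
IMAGE="cloudb0x/drive_strm"   # guess, based on similar Cloudbox image names
docker_cmd="docker run -d --name=drive_strm \
  -p 7249:7249 \
  -v $HOME/drive_strm/config.json:/config/config.json \
  -v $HOME/drive_strm/vault.db:/config/vault.db \
  -v $HOME/strm:/strm \
  $IMAGE"
echo "$docker_cmd"   # print rather than run, since the details are unverified
```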

I did find that the quickest way to run it, at least for the initial strm generation, was on the Mac; it seemed way quicker than anything else. So I edited the config file to put in the URL of where the stuff will end up, and will copy the DB and strm files to the final server once the process is complete. Not sure if this will pick up new changes; will have to see.