This is a Python script that zips a folder on the server and stores it in Google Drive. It is supposed to be executed regularly, and it also has a function to delete old backup files so that Google Drive does not fill up.
The GitHub repository is here.
I use this script to back up the save data of my Minecraft server, but I figured it could be useful for other purposes as well, so I wrote it up in this Qiita article. By the way, I run the script on AWS EC2 (the OS is Ubuntu).
To get it working, install the modules needed to use the Google Drive API with pip:
pip install --upgrade google-api-python-client google-auth-httplib2 google-auth-oauthlib
The full code is below.
import pickle
import os.path
import datetime
import shutil
from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request
from googleapiclient.http import MediaFileUpload
# If modifying these scopes, delete the file token.pickle.
SCOPES = ['https://www.googleapis.com/auth/drive']
# Number of backup generations to keep
GENERATIONS = 3
# Directory containing the data to back up
DATA_DIR = '/home/minecraft/server/'
# Name of the folder you want to back up
DATA_NAME = 'newworld'
# ID of the Google Drive folder to store backups in (it appears in the URL)
PARENT_ID = 'xxxxxx'
def main():
    '''
    Main function
    '''
    # Get credentials
    creds = get_creds()
    # Create the Drive service
    service = build('drive', 'v3', credentials=creds)
    # Check which backup files have already been uploaded
    existing_files = get_existing_files(service)
    if len(existing_files) >= (GENERATIONS - 1):
        # If more generations than necessary are stored, delete the oldest
        delete_unnecessary_files(service, existing_files)
    # Zip the folder to upload
    create_zip_file()
    # Upload the zip file
    upload_file(service)
def get_creds():
    '''
    1. Get credentials
    '''
    creds = None
    # Try to load a previously saved token
    if os.path.exists('token.pickle'):
        with open('token.pickle', 'rb') as token:
            creds = pickle.load(token)
    # If there are no valid credentials, obtain new ones
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open('token.pickle', 'wb') as token:
            pickle.dump(creds, token)
    return creds
def get_existing_files(service):
    '''
    2. Get the existing backup files
    '''
    # Call the Drive v3 API, listing only files inside the backup folder
    query = "'" + PARENT_ID + "' in parents"
    results = service.files().list(
        fields="nextPageToken, files(id, name, createdTime)", q=query).execute()
    # Returns an empty list when there are no backups yet
    return results.get('files', [])
def delete_unnecessary_files(service, existing_files):
    '''
    3. Delete backup files that no longer need to be kept
    '''
    # Get the creation time of each file and convert it to a datetime object
    for f in existing_files:
        f['datetime_createdTime'] = datetime.datetime.strptime(
            f['createdTime'], '%Y-%m-%dT%H:%M:%S.%fZ')
    # Sort by creation time, oldest first
    sorted_files = sorted(existing_files, key=lambda x: x['datetime_createdTime'])
    # Delete the oldest files, leaving room for the new backup
    delete_len = len(sorted_files) - (GENERATIONS - 1)
    for i in range(delete_len):
        service.files().delete(fileId=sorted_files[i]['id']).execute()
def create_zip_file():
    '''
    4. Zip the folder you want to back up
    '''
    shutil.make_archive(DATA_DIR + DATA_NAME, 'zip', root_dir=DATA_DIR + DATA_NAME)
def upload_file(service):
    '''
    5. Upload the zip file
    '''
    # Name the backup after today's date (e.g. '04-01-21.zip')
    today_str = datetime.datetime.now().strftime("%D").replace("/", "-")
    file_metadata = {'name': today_str + '.zip', 'parents': [PARENT_ID]}
    media = MediaFileUpload(DATA_DIR + DATA_NAME + '.zip', mimetype='application/zip')
    service.files().create(
        body=file_metadata, media_body=media, fields='id').execute()


if __name__ == '__main__':
    main()
The credential-acquisition part uses the official Google Drive API sample as-is.
The first time you run the script, a browser window opens and asks you to authorize access. Once the authentication flow completes, the credentials are saved in token.pickle, so you will not need to authenticate in the browser on subsequent runs.
Next, fetch the files already saved as backups in the Google Drive folder. This is a slight modification of the official Google Drive API sample so that it lists only the files under a specific folder.
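One caveat: the call above fetches only the first page of results, so if the folder ever held more files than one page returns, the list would be incomplete. A hedged sketch of a paginated variant follows; the helper name list_all_files is my own, not part of the original script.

```python
def list_all_files(service, parent_id):
    """Collect every file in the folder, following nextPageToken until exhausted."""
    query = "'" + parent_id + "' in parents"
    files = []
    page_token = None
    while True:
        results = service.files().list(
            q=query,
            fields="nextPageToken, files(id, name, createdTime)",
            pageToken=page_token).execute()
        files.extend(results.get('files', []))
        page_token = results.get('nextPageToken')
        if page_token is None:
            break
    return files
```

With only a handful of backup generations kept, the original single-page call is fine in practice; pagination only matters if the folder is shared with other content.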
Then delete unnecessary files so that Google Drive does not fill up. The file metadata fetched in the previous step is sorted by creation time, and the oldest files are deleted.
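The ordering relies on parsing Drive's createdTime strings with strptime. A minimal, self-contained illustration of that parse-and-sort step (the timestamps are invented for the example):

```python
import datetime

created_times = [
    '2021-03-03T01:00:00.000Z',
    '2021-03-01T01:00:00.000Z',
    '2021-03-02T01:00:00.000Z',
]
# Same format string the script uses for Drive's createdTime field
parsed = [datetime.datetime.strptime(t, '%Y-%m-%dT%H:%M:%S.%fZ')
          for t in created_times]
oldest_first = sorted(parsed)
print(oldest_first[0].date())  # the oldest backup, which would be deleted first
```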
The folder to back up is then zipped; shutil.make_archive does the work in a single call.
Finally, upload the zip file. By setting the parents field in the file metadata, it is uploaded into the specified folder.
I have the above script set to run when the server starts. There are several ways to do this, but I used cron. Specifically, I added the following line to crontab:
@reboot python3 /home/minecraft/tools/backup.py
Adding @reboot makes the command run at boot time.
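Since the goal is a regular backup, a scheduled entry could sit alongside the @reboot line as well. For example (the schedule here is my own choice, not from the original article), to also run the script every day at 04:00:

```shell
0 4 * * * python3 /home/minecraft/tools/backup.py
```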