Azure Functions supports the Python runtime. However, I couldn't find a clear example of the flow "input to Storage (as the trigger) → processing in Functions → output back to Storage", so I'm leaving this note here as a memorandum.
local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<Storage account>;AccountKey=<Account key>;EndpointSuffix=core.windows.net"
  }
}
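As a side note, the entries under Values are exposed to the function as environment variables (from local.settings.json locally, and from the Function App's application settings once deployed), so the connection string can also be read from code if you ever need it directly:

import os

# "AzureWebJobsStorage" is the key defined in the Values section above.
connection_string = os.environ["AzureWebJobsStorage"]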
function.json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "inputblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "container/input/{name}",
      "connection": ""
    },
    {
      "name": "outputblob",
      "type": "blob",
      "direction": "out",
      "path": "container/output/{name}.csv",
      "connection": ""
    }
  ]
}
__init__.py
import logging

import azure.functions as func


def main(inputblob: func.InputStream, outputblob: func.Out[str]):
    logging.info(f"Python blob trigger function processed blob. v2.0\n"
                 f"Name: {inputblob.name}\n"
                 f"Blob Size: {inputblob.length} bytes\n")
    # Read the entire input blob as UTF-8 text.
    input_text = inputblob.read(size=-1).decode("utf-8")
    # What you want to do (here, simply append "hoge" to the input text)
    output_text = input_text + "hoge"
    outputblob.set(output_text)
This writes the result to the path "container/output/{name}.csv" set in function.json above, so the output file name is the input file name with ".csv" appended.
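The "# What you want to do" comment in __init__.py is where the actual processing goes. As one illustration (not from the original post), here is a minimal sketch of __init__.py that treats the input blob as tab-separated UTF-8 text and rewrites it as CSV before handing it to the output binding:

import csv
import io
import logging

import azure.functions as func


def main(inputblob: func.InputStream, outputblob: func.Out[str]):
    logging.info(f"Converting blob to CSV: {inputblob.name}")

    # Read the whole input blob as UTF-8 text.
    input_text = inputblob.read(size=-1).decode("utf-8")

    # Illustrative transformation: treat each line as tab-separated fields
    # and rewrite it as a CSV row.
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    for line in input_text.splitlines():
        writer.writerow(line.split("\t"))

    # The output binding writes this string to container/output/{name}.csv.
    outputblob.set(buffer.getvalue())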
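To actually fire the trigger while the function host is running locally, it's enough to drop a file into container/input of the storage account configured above. A minimal sketch using the azure-storage-blob package; the file name sample.txt is only an example:

from azure.storage.blob import BlobServiceClient

# Same connection string as AzureWebJobsStorage in local.settings.json.
conn_str = "DefaultEndpointsProtocol=https;AccountName=<Storage account>;AccountKey=<Account key>;EndpointSuffix=core.windows.net"

service = BlobServiceClient.from_connection_string(conn_str)
blob_client = service.get_blob_client(container="container", blob="input/sample.txt")

# Uploading under input/ matches the blobTrigger path "container/input/{name}".
with open("sample.txt", "rb") as f:
    blob_client.upload_blob(f, overwrite=True)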