Apparently, Durable Functions for Python became a Public Preview on June 24, 2020.
Durable Functions now supports Python
So, I quickly tried it locally.
Based on the documentation and the GitHub repository, I set up the following verification environment.
Create a project using the Azure Functions extension. Follow the extension prompts to create a Python function project.
Open "requirements.txt" of the created project and add the module "azure-functions-durable> = 1.0.0b6" for Durable Functions.
Open a VS Code terminal, activate the Python virtual environment created for the project, and install the modules listed in "requirements.txt" into it.
> .\.venv\Scripts\activate
> python -m pip install -r requirements.txt
Once the project is created, create the Orchestrator, Activity, and Client functions. The documentation assumes dedicated templates for Durable Functions, but since no such templates exist at the moment, create each function from the "Http Trigger" template and rewrite the contents of "__init__.py" and "functions.json".
Create a "durable-activity" function with the "Http Trigger" template, and rewrite "\ _init \ _. Py" and "functions.json" with the following contents. The content is as simple as returning the value passed to the function with "Hello" at the beginning. In order to make the execution state of the function easy to understand, the process is waited for 2 seconds so that "Activity {name}" is displayed in the log.
__init__.py
import logging
import time

def main(name: str) -> str:
    # Wait 2 seconds and log the name so the execution order is visible
    time.sleep(2)
    logging.warning(f"Activity {name}")
    return f'Hello {name}!'
functions.json
{
"scriptFile": "__init__.py",
"bindings": [
{
"name": "name",
"type": "activityTrigger",
"direction": "in",
"datatype": "string"
}
],
"disabled": false
}
Create a "durable-orchestrator" function with the "Http Trigger" template, and rewrite "\ _init \ _. Py" and "functions.json" with the following contents. The values of "Tokyo", "Seattle", and "London" are passed to the Activity function created earlier, and the result is stored in an array. Yield is attached to each call to operate as a function chain. The method of calling the Activity function is the same as JavaScript, as long as the function becomes a snake case.
__init__.py
import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    # Call the Activity function three times in sequence
    task1 = yield context.call_activity("durable-activity", "Tokyo")
    task2 = yield context.call_activity("durable-activity", "Seattle")
    task3 = yield context.call_activity("durable-activity", "London")
    outputs = [task1, task2, task3]
    return outputs

main = df.Orchestrator.create(orchestrator_function)
functions.json
{
"scriptFile": "__init__.py",
"bindings": [
{
"name": "context",
"type": "orchestrationTrigger",
"direction": "in"
}
],
"disabled": false
}
Create a "durable-client" function with the "Http Trigger" template, and rewrite "\ _init \ _. Py" and "functions.json" with the following contents. The call is made by specifying the Orchestrator function in "client.start_new". The calling method is the same as JavaScript, but the binding type is "durableClient" in JavaScript, but it is also different from "orchestrationClient".
__init__.py
import logging

from azure.durable_functions import DurableOrchestrationClient
import azure.functions as func

async def main(req: func.HttpRequest, starter: str, message):
    logging.info(starter)
    client = DurableOrchestrationClient(starter)
    # Start the Orchestrator and return its check-status response
    instance_id = await client.start_new('durable-orchestrator')
    response = client.create_check_status_response(req, instance_id)
    message.set(response)
functions.json
{
"scriptFile": "__init__.py",
"bindings": [
{
"authLevel": "anonymous",
"name": "req",
"type": "httpTrigger",
"direction": "in",
"methods": [
"post",
"get"
]
},
{
"direction": "out",
"name": "message",
"type": "http"
},
{
"name": "starter",
"type": "orchestrationClient",
"direction": "in",
"datatype": "string"
}
]
}
Now that the functions are created, let's run them locally. Durable Functions requires Azure Storage for its state management, so to run locally, use the Azure Storage Emulator. Open "local.settings.json" at the root of the project, set "AzureWebJobsStorage" to "UseDevelopmentStorage=true", and start the Azure Storage Emulator.
local.settings.json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "python"
}
}
Press F5 to start debugging. Once the host starts, the URL of the Client function is displayed, so access it with a client such as Postman.
When you make a GET request with Postman, logs start flowing in the terminal, and the status management URLs are returned in the response.
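If you want to script the request instead of using Postman, here is a minimal sketch using the requests library (assuming the default local address http://localhost:7071 and the function name above; your URL may differ):
import requests

# Kick off the orchestration via the Client function (default local URL assumed)
resp = requests.get("http://localhost:7071/api/durable-client")
print(resp.status_code)                  # 202 Accepted
print(resp.json()["statusQueryGetUri"])  # URL for checking the orchestration status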
If you check the log, you can see the yellow Warning lines "Activity Tokyo", "Activity Seattle", and "Activity London" appear in that order, roughly every 2 seconds, which confirms that the function chain is working properly.
When I checked the status of the Orchestrator via the "statusQueryGetUri" in the response, the runtimeStatus was "Completed", and the output was the array "Hello Tokyo!", "Hello Seattle!", "Hello London!".
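To wait for completion programmatically, you could poll "statusQueryGetUri" until runtimeStatus reaches a terminal state. A minimal sketch, where the status URL is a placeholder to be replaced with the actual value from the response:
import time
import requests

# Placeholder: paste the statusQueryGetUri value returned by the Client function
status_url = "<statusQueryGetUri from the response>"

while True:
    status = requests.get(status_url).json()
    print(status["runtimeStatus"])
    if status["runtimeStatus"] in ("Completed", "Failed", "Terminated"):
        # For this sample: ["Hello Tokyo!", "Hello Seattle!", "Hello London!"]
        print(status["output"])
        break
    time.sleep(1)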
Fan-Out/Fan-In
Since I've come this far, I will also try Fan-Out/Fan-In. Rewrite the Orchestrator function as follows.
__init__.py
import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    # Call the Activity function
    # task1 = yield context.call_activity("durable-activity", "Tokyo")
    # task2 = yield context.call_activity("durable-activity", "Seattle")
    # task3 = yield context.call_activity("durable-activity", "London")
    # outputs = [task1, task2, task3]

    # Queue the Activity calls without yielding, then wait for all of them
    tasks = []
    tasks.append(context.call_activity("durable-activity", "Tokyo"))
    tasks.append(context.call_activity("durable-activity", "Seattle"))
    tasks.append(context.call_activity("durable-activity", "London"))
    outputs = yield context.task_all(tasks)
    return outputs

main = df.Orchestrator.create(orchestrator_function)
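As an aside, the same fan-out could be written more compactly with a list comprehension; this is just an equivalent sketch, not part of the original sample:
import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    # Fan out over the city names, then fan in with task_all
    cities = ["Tokyo", "Seattle", "London"]
    tasks = [context.call_activity("durable-activity", city) for city in cities]
    outputs = yield context.task_all(tasks)
    return outputs

main = df.Orchestrator.create(orchestrator_function)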
I ran the rewritten version. The tasks registered in the order "Tokyo", "Seattle", "London" were processed in a shuffled order: "Seattle", "London", "Tokyo". However, each task still completed about 2 seconds apart, so it did not look like the parallel processing expected from Fan-Out/Fan-In. It is still in preview, so I expect this to improve in the future.
Even though this is the Python version, it is still Durable Functions, so the coding style is the same as JavaScript, and anyone with JavaScript development experience should be able to develop without difficulty. As described in the release announcement, Python enables interesting new scenarios, such as building serverless parallel processing environments for machine learning and data analysis.