Categories
microservices patterns

Cloud agnostic apps with DAPR

DAPR is cool. As stated on its website, it provides “APIs for building portable and reliable microservices”, and it works with many clouds and external services. As a result, you only need to configure the services and then use the DAPR APIs. It is true, and I’ll show you. You will find the code for this article here. A must-have tool when working with microservices.

Applications can use DAPR as a sidecar container or as a separate process. I’ll show you a local version of the app, where DAPR is configured to use Redis running in a container. The repo has Azure, AWS, and GCP configurations as well.

Before we can start the adventure you have to install DAPR.

Running the app locally

Configuration

You have to start off on the right foot. Because of that, we will configure a secrets store first, where you can keep passwords and the like. I could skip this step, but then there is a risk someone would never find out how to do it properly and would ship passwords in plain text. Here we go.

# save under ./components/secrets.yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: local-secret-store
  namespace: default
spec:
  type: secretstores.local.file
  version: v1
  metadata:
  - name: secretsFile
    value: ../secrets.json

The file secrets.json should have all your secrets, like connection strings, user and pass pairs, etc. Don’t commit this file.

{
    "redisPass": "just a password"
}
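If your application ever needs to read one of these secrets directly, it can ask DAPR for it instead of parsing secrets.json itself. Here is a minimal sketch (illustrative only, not part of the example app), assuming the local-secret-store component above and a running sidecar:

# illustrative only; "local-secret-store" matches the component defined above
from dapr.clients import DaprClient

with DaprClient() as client:
    secret = client.get_secret(store_name="local-secret-store", key="redisPass")
    print(secret.secret["redisPass"])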

The next file is the publish/subscribe configuration. Dead simple, but I’d recommend going through the docs as there is much more to pub/sub. Here you can reference your secrets as shown.

# save under ./components/pubsub.yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: order_pub_sub
spec:
  type: pubsub.redis
  version: v1
  metadata:
  - name: redisHost
    value: localhost:6379
  - name: redisPass
    secretKeyRef:
      name: redisPass
      key: redisPass
auth:
  secretStore: local-secret-store

Publisher and subscriber

With config out of the way, the only things left are the publisher and the subscriber. As mentioned before, the app you write talks to the DAPR API. This means you can use plain HTTP calls or the Dapr client. The best part is that no matter what is on the other end, be it Redis or PostgreSQL, your code will not change even when you swap your external services.
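For completeness, here is a rough sketch of what the plain HTTP variant could look like. The sidecar exposes its HTTP port to your app through the DAPR_HTTP_PORT environment variable (3500 is only a common default), and publishing goes through the v1.0 publish endpoint:

# illustrative only; the publisher below uses the Dapr client instead
import os

import requests

dapr_port = os.getenv("DAPR_HTTP_PORT", "3500")
requests.post(
    f"http://localhost:{dapr_port}/v1.0/publish/order_pub_sub/orders",
    json={"product": "falafel"},
)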

Here goes the publisher that will send events to a topic. The topic can be hosted anywhere; here is a list of supported brokers. The list is long, however only 3 are stable. I really like how DAPR is approaching component certification, though. There are well-defined requirements to pass in order to advance from Alpha, to Beta, and finally to Stable.

# save under ./publisher/app.py

import logging

from dapr.clients import DaprClient
from fastapi import FastAPI
from pydantic import BaseModel

logging.basicConfig(level=logging.INFO)


app = FastAPI()


class Order(BaseModel):
    product: str


@app.post("/orders")
def orders(order: Order):
    logging.info("Received order")
    with DaprClient() as dapr_client:
        dapr_client.publish_event(
            pubsub_name="order_pub_sub",
            topic_name="orders",
            data=order.json(),
            data_content_type="application/json",
        )
    return order

Here is a consumer.

# save under ./consumer/app.py

from dapr.ext.fastapi import DaprApp
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
dapr_app = DaprApp(app)


class CloudEvent(BaseModel):
    datacontenttype: str
    source: str
    topic: str
    pubsubname: str
    data: dict
    id: str
    specversion: str
    tracestate: str
    type: str
    traceid: str


@dapr_app.subscribe(pubsub="order_pub_sub", topic="orders")
def orders_subscriber(event: CloudEvent):
    print("Subscriber received : %s" % event.data["product"], flush=True)
    return {"success": True}
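If you want to poke at the subscriber without the sidecar, you can POST a CloudEvent-shaped payload straight to the route that DaprApp registers (you will see the exact path in the logs later in this post). A rough sketch, assuming the consumer runs on port 8000; normally the Dapr sidecar makes this call for you:

# illustrative only; field values are made up to satisfy the CloudEvent model
import requests

event = {
    "datacontenttype": "application/json",
    "source": "manual-test",
    "topic": "orders",
    "pubsubname": "order_pub_sub",
    "data": {"product": "falafel"},
    "id": "1",
    "specversion": "1.0",
    "tracestate": "",
    "type": "com.dapr.event.sent",
    "traceid": "00-0af7651916cd43dd8448eb211c80319c-b7ad6b7169203331-01",
}
requests.post("http://localhost:8000/events/order_pub_sub/orders", json=event)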

Running the apps

Now you can run both apps in separate terminal windows and watch them talk to each other through the configured broker. For this example we are using Redis as the broker. You will see how easy it is to run them on different platforms.

In the first terminal, run the consumer from the consumer directory.

$ dapr run --app-id order-processor --components-path ../components/ --app-port 8000 -- uvicorn app:app

In the other terminal, run the producer from the publisher directory.

$ dapr run --app-id order-publisher --components-path ../components/ --app-port 8001 -- uvicorn app:app --port 8001

After you make an HTTP call to the producer, you should see both of them producing log messages as follows.

$ http :8001/orders product=falafel

# producer
== APP == INFO:root:Received order
== APP == INFO:     127.0.0.1:49698 - "POST /orders HTTP/1.1" 200 OK

# subscriber
== APP == Subscriber received : falafel
== APP == INFO:     127.0.0.1:49701 - "POST /events/order_pub_sub/orders HTTP/1.1" 200 OK

Running the app in the cloud

It took us a bit to reach the crux of this post. We had to build something and run it locally so that we can now run it in the cloud. The example above will run in the cloud with a simple change of configuration.

The simplest configuration is for Azure. Change your pubsub.yaml so it looks as follows, and update your secrets.json as well.

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: order_pub_sub
spec:
  type: pubsub.azure.servicebus
  version: v1
  metadata:
  - name: connectionString
    secretKeyRef:
      name: connectionStrings:azure
      key: connectionStrings:azure
auth:
  secretStore: local-secret-store

Your secrets.json should now look like this:

{
  "connectionStrings": {
    "azure": "YOUR CONNECTION STRING"
  }
}

Rerun both commands in the terminal and the output will look the same as with the local env, but the app will now run on Azure Service Bus.

Bloody magic, if you ask me. You can mix and match your dependencies without changing your application. In some cases you may even get features a particular cloud service doesn’t offer natively, like routing messages based on the body while using Azure Service Bus. That will be another post, though.

Here is the repo for this post; it includes configuration for all the providers listed below:

  • Azure
  • Google Cloud
  • AWS

Please remember to update your secrets.json.

Have fun 🙂

Categories
azure python

Azure Application Insights, FastAPI, and tracing

Debugging is difficult. What’s even more difficult is debugging production apps. Live production apps.

There are tools designed for this purpose. Azure has Application Insights, a product that makes retracing the history of events easier. When set up correctly, you can go from an HTTP request down to a DB call with all the query arguments. Pretty useful and definitely more convenient than sifting through log messages.

Application Insights console with end-to-end transaction details

Here you can see the exact query that had been executed on the database.

Application Insights console with SQL query details

You may also see every log related to a particular request in Log Analytics.

Application Insights logs with a query displaying only relevant logs

Improving your work life like this is pretty simple. Everything here is done using opencensus and its extensions. Opencensus integrates with Azure pretty nicely. The first thing to do is install the required dependencies.

# pip
pip install opencensus-ext-azure opencensus-ext-logging opencensus-ext-sqlalchemy opencensus-ext-requests

# pipenv
pipenv install opencensus-ext-azure opencensus-ext-logging opencensus-ext-sqlalchemy opencensus-ext-requests

# poetry
poetry add opencensus-ext-azure opencensus-ext-logging opencensus-ext-sqlalchemy opencensus-ext-requests

The next step is to activate them by including a couple of lines in your code. Here I activate three extensions: logging, requests, and sqlalchemy. Here is a list of other official extensions.

import logging

from opencensus.trace import config_integration
from opencensus.ext.azure.log_exporter import AzureLogHandler

logger = logging.getLogger(__name__)
config_integration.trace_integrations(["logging", "requests", "sqlalchemy"])

handler = AzureLogHandler(
    connection_string="InstrumentationKey=YOUR_KEY"
)
handler.setFormatter(logging.Formatter("%(traceId)s %(spanId)s %(message)s"))
logger.addHandler(handler)

One last thing is a middleware that will instrument every request. This code is taken from Microsoft’s documentation.

@app.middleware("http")
async def middlewareOpencensus(request: Request, call_next):
    tracer = Tracer(
        exporter=AzureExporter(
            connection_string="InstrumentationKey=YOUR_KEY"
        ),
        sampler=ProbabilitySampler(1.0),
    )
    with tracer.span("main") as span:
        span.span_kind = SpanKind.SERVER

        response = await call_next(request)

        tracer.add_attribute_to_current_span(
            attribute_key=HTTP_STATUS_CODE, attribute_value=response.status_code
        )
        tracer.add_attribute_to_current_span(
            attribute_key=HTTP_URL, attribute_value=str(request.url)
        )

    return response
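To see the integrations in action, here is an illustrative endpoint; the route, the upstream URL, and the SQLite connection string are made up, and it assumes app is your FastAPI instance and logger is the one configured above. The outgoing HTTP call and the SQL query should both show up as dependencies under the request span created by the middleware, and the log line is correlated through the AzureLogHandler.

# illustrative sketch only, not part of the original app
import requests
from sqlalchemy import create_engine, text

engine = create_engine("sqlite:///example.db")  # placeholder connection string


@app.get("/example")
def example():
    # traced as an outgoing dependency by the "requests" integration
    upstream_status = requests.get("https://example.com", timeout=5).status_code
    # traced as a SQL dependency by the "sqlalchemy" integration
    with engine.connect() as conn:
        conn.execute(text("SELECT 1"))
    logger.info("example endpoint called")  # correlated via AzureLogHandler
    return {"upstream_status": upstream_status}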

You are done 🙂 You will not lose information on what is going on in the app. You will be quicker at finding problems and resolving them. Life’s good now.

Categories
azure typescript

Guide on Developing Azure Functions locally

Why

I have spent way too much time figuring out how to create a local development environment for working on Azure Functions. Most of that time was spent reading Microsoft’s documentation on the topic. Hopefully this guide will save you some time 🙂

The guide was prepared and verified on Ubuntu. Additionally, my coworker verified that it works on macOS High Sierra. If you are here to create your first Azure Function locally, please read along.

What are Azure Functions

Azure Functions are small pieces of code that run on Azure infrastructure in a serverless fashion. The infrastructure, provisioning, and scaling are Azure’s responsibility. If you wish to read more, below are links to detailed explanations by the Azure team. I’d recommend at least skimming the first one.

Software required for local development

There are a couple of tools required to be able to develop locally.

  • npm, tooling is mostly in JS/TS
  • .NET Core SDK, required for function extensions
  • Azure Functions Core Tools (later referred as AFCT)
  • Azurite legacy, for storage emulation. It works with Blob, Table, and Queue storage. The legacy version is required as the current one supports only Blob storage.
  • Docker

Handling http requests

This part explains how to create a function that is triggered by an HTTP request and responds with an HTTP response.

Create Azure Function project

Start by calling func init HTTPFunction in a directory of your choosing. You will be asked a couple of questions regarding your project.

majki@enchilada ~/projects/azure-tracker-local
% func init HTTPFunction
Select a worker runtime:
1. dotnet
2. node
3. python (preview)
4. powershell (preview)
Choose option: 2
node
Select a Language:
1. javascript
2. typescript
Choose option: 2
typescript
Writing .funcignore
Writing package.json
Writing tsconfig.json
Writing .gitignore
Writing host.json
Writing local.settings.json
Writing /home/majki/projects/azure-tracker-local/HTTPFunction/.vscode/extensions.json

Create Azure Function

Create the function next. As in the previous step, you will be asked questions regarding the function you wish to create. I selected 8. HTTP trigger as this function should respond to HTTP requests.

majki@enchilada ~/projects/azure-tracker-local
% cd HTTPFunction
majki@enchilada ~/projects/azure-tracker-local/HTTPFunction
% func new
Select a template:
1. Azure Blob Storage trigger
2. Azure Cosmos DB trigger
3. Durable Functions activity
4. Durable Functions HTTP starter
5. Durable Functions orchestrator
6. Azure Event Grid trigger
7. Azure Event Hub trigger
8. HTTP trigger
9. IoT Hub (Event Hub)
10. Azure Queue Storage trigger
11. SendGrid
12. Azure Service Bus Queue trigger
13. Azure Service Bus Topic trigger
14. Timer trigger
Choose option: 8
HTTP trigger
Function name: [HttpTrigger]
Writing /home/majki/projects/azure-tracker-local/HTTPFunction/HttpTrigger/index.ts
Writing /home/majki/projects/azure-tracker-local/HTTPFunction/HttpTrigger/function.json
The function "HttpTrigger" was created successfully from the "HTTP trigger" template.

Install dependencies

npm install

Run function

npm run start

Verify

majki@enchilada ~
% curl http://localhost:7071/api/HttpTrigger
Please pass a name on the query string or in the request body%
majki@enchilada ~
% curl "http://localhost:7071/api/HttpTrigger\?name\=Mike"
Hello Mike

Handling Queue Storage

This part explains how to create a function that is triggered by an HTTP request and sends a message to Queue Storage.

Install storage emulation

On Windows, you may go with the storage emulator provided by Microsoft. On Ubuntu, there is the community-sourced Azurite. Unfortunately, the most recent version does not support Queue Storage. For queues, install version 2.7.0 by following the instructions.

Create Azure Function project

Start by calling func init HTTPFunction in a directory of your choosing. You will be asked a couple of questions regarding your project.

majki@enchilada ~/projects/azure-tracker-local
% func init HTTPFunction
Select a worker runtime:
1. dotnet
2. node
3. python (preview)
4. powershell (preview)
Choose option: 2
node
Select a Language:
1. javascript
2. typescript
Choose option: 2
typescript
Writing .funcignore
Writing package.json
Writing tsconfig.json
Writing .gitignore
Writing host.json
Writing local.settings.json
Writing /home/majki/projects/azure-tracker-local/HTTPFunction/.vscode/extensions.json

Create function

Create the function next. As in the previous step, you will be asked questions regarding the function you wish to create. I selected 8. HTTP trigger since this function should respond to HTTP requests and, in response, send a message to Queue Storage.

majki@enchilada ~/projects/azure-tracker-local
% cd HTTPFunction
majki@enchilada ~/projects/azure-tracker-local/HTTPFunction
% func new
Select a template:
1. Azure Blob Storage trigger
2. Azure Cosmos DB trigger
3. Durable Functions activity
4. Durable Functions HTTP starter
5. Durable Functions orchestrator
6. Azure Event Grid trigger
7. Azure Event Hub trigger
8. HTTP trigger
9. IoT Hub (Event Hub)
10. Azure Queue Storage trigger
11. SendGrid
12. Azure Service Bus Queue trigger
13. Azure Service Bus Topic trigger
14. Timer trigger
Choose option: 8
HTTP trigger
Function name: [HttpTrigger] HTTP2Queue
Writing /home/majki/projects/azure-tracker-local/HTTPFunction/HTTP2Queue/index.ts
Writing /home/majki/projects/azure-tracker-local/HTTPFunction/HTTP2Queue/function.json
The function "HTTP2Queue" was created successfully from the "HTTP trigger" template.

Configure function output

In the editor, open the function.json file located in the directory named after the function. Remove the section related to the http output and replace it with the queue output as shown below:

{
  "type": "queue",
  "direction": "out",
  "name": "msg",
  "queueName": "outgoingqueue",
  "connection": "MyQueueConnectionString"
}

The file is supposed to look like this:

{
    "bindings": [
        {
            "authLevel": "function",
            "type": "httpTrigger",
            "direction": "in",
            "name": "req",
            "methods": [
                "get",
                "post"
            ]
        },
        {
            "type": "queue",
            "direction": "out",
            "name": "msg",
            "queueName": "outgoingqueue",
            "connection": "MyQueueConnectionString"
        }
    ],
    "scriptFile": "../dist/HTTP2Queue/index.js"
}

Configure queue connection

Some bindings defined in function.json require a connection string. After adding the outgoing queue binding, you need to add its connection string as well. Define the connection in local.settings.json by adding "MyQueueConnectionString". The value of this string is taken from the Azurite documentation.

{
    "IsEncrypted": false,
    "Values": {
        "FUNCTIONS_WORKER_RUNTIME": "node",
        "AzureWebJobsStorage": "{AzureWebJobsStorage}",
        "MyQueueConnectionString": "DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;"
    }
}

Install dependencies

npm install

Install extensions

Since we are using an output binding other than http, we have to install the storage extensions to make the queue work.

func extensions install

Run Azurite

Run Azurite by using Docker

docker run -e executable=queue -d -t -p 10001:10001 -v queue_Storage:/opt/azurite/folder arafato/azurite

More information on how to run it is here.

Run function

In order to put any message on the queue, the function needs to be modified a bit. Please change it accordingly.

import { AzureFunction, Context, HttpRequest } from "@azure/functions";

const httpTrigger: AzureFunction = async function (context: Context, req: HttpRequest): Promise<void> {
    context.log('HTTP trigger function processed a request.');
    const name = (req.query.name || (req.body && req.body.name));
    // assigning to the "msg" output binding defined in function.json enqueues the message
    context.bindings.msg = name;
    context.done();
};

export default httpTrigger;

And then

npm run start

Verify

Now we can see how this works. We are going to call our function a couple of times and then see what messages have been put on the queue.

majki@enchilada ~
% curl "http://127.0.0.1:7071/api/HTTP2Queue?name=Mike"
HTTP/1.1 204 No Content
Content-Length: 0
Date: Tue, 14 May 2019 06:39:25 GMT
Server: Kestrel
majki@enchilada ~
% curl "http://127.0.0.1:7071/api/HTTP2Queue?name=Mickey"
HTTP/1.1 204 No Content
Content-Length: 0
Date: Tue, 14 May 2019 06:39:28 GMT
Server: Kestrel

There should be two messages on the queue at this moment. We can query the queue by running:

majki@enchilada ~
% curl "http://127.0.0.1:10001/devstoreaccount1/outgoingqueue/messages?peekonly=true&numofmessages=10"
HTTP/1.1 200 OK
Connection: keep-alive
Content-Length: 885
Content-Type: application/xml; charset=utf-8
ETag: W/"375-pdRmS0dxI0dhZlfRbwGMunhL+WM"
X-Powered-By: Express
date: Tue, 14 May 2019 06:42:39 GMT
x-ms-request-id: 77856d80-7613-11e9-afc4-b5f560b4039e
x-ms-version: 2016-05-31

<?xml version='1.0'?><QueueMessagesList><QueueMessage><MessageId>cae264ef-74bd-4bde-b03e-01be8821968a</MessageId><InsertionTime>Tue, 14 May 2019 06:29:04 GMT</InsertionTime><ExpirationTime>Tue, 21 May 2019 06:29:04 GMT</ExpirationTime><DequeueCount>0</DequeueCount><MessageText>TWlrZQ==</MessageText></QueueMessage><QueueMessage><MessageId>13732448-a498-4fc5-85cf-8c4afafe082b</MessageId><InsertionTime>Tue, 14 May 2019 06:39:25 GMT</InsertionTime><ExpirationTime>Tue, 21 May 2019 06:39:25 GMT</ExpirationTime><DequeueCount>0</DequeueCount><MessageText>TWlrZQ==</MessageText></QueueMessage></QueueMessagesList>

Azurite supports the Queue Storage API, so if you need more operations, read the linked docs. Note that MessageText is Base64-encoded: "TWlrZQ==" decodes to "Mike".