AZ-204 - Developer associate - Connect and consume services
Last updated Jul 6, 2022
Published Jul 1, 2022
Service bus
For decoupling applications
- basic
- queues / no topics
- message size up to 256 KB (1 MB for the premium tier)
- standard
- queues
- topics
- premium
- Create queue
- Peek
- Receive
- it is possible to specify the message content type (XML by default)
- Policies
- can be used to allow read or write
- Azure.Messaging.ServiceBus is the package for C# integration
- Queue
- ServiceBusClient to connect to service bus
- ServiceBusReceiver to peek/receive messages from the queue
- Peek and Lock
- uses CompleteMessageAsync to delete the message
- Receive and Delete
- DeadLetterQueue stores expired or failed messages
- to access the dead letter queue, append /$DeadLetterQueue to the queue name (the connection string stays the same)
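The peek-lock flow above can be sketched with the Azure.Messaging.ServiceBus package. The connection string and queue name are placeholders; this is a sketch, not a complete application:

```csharp
using System;
using Azure.Messaging.ServiceBus;

// Placeholders - use your own namespace connection string and queue name
await using var client = new ServiceBusClient("<connection-string>");

// Send a message, optionally overriding the content type
ServiceBusSender sender = client.CreateSender("myqueue");
await sender.SendMessageAsync(
    new ServiceBusMessage("hello") { ContentType = "application/json" });

// Peek-lock receive: the message stays locked until it is completed
ServiceBusReceiver receiver = client.CreateReceiver("myqueue");
ServiceBusReceivedMessage msg = await receiver.ReceiveMessageAsync();
Console.WriteLine(msg.Body.ToString());

// CompleteMessageAsync deletes the message from the queue
await receiver.CompleteMessageAsync(msg);

// The dead-letter sub-queue can also be read via the SDK:
ServiceBusReceiver dlq = client.CreateReceiver("myqueue",
    new ServiceBusReceiverOptions { SubQueue = SubQueue.DeadLetter });
```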
- Duplicate detection
- must be enabled at queue creation (it cannot be turned on later)
- prevents duplicates within a configurable time window
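Enabling duplicate detection at creation time can be done from the CLI; the resource names below are hypothetical:

```shell
# Duplicate detection must be set when the queue is created;
# the detection window here is 10 minutes (ISO 8601 duration)
az servicebus queue create \
  --resource-group my-rg \
  --namespace-name my-sb-ns \
  --name orders \
  --enable-duplicate-detection true \
  --duplicate-detection-history-time-window PT10M
```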
- Topics
- are based on subscribers (and it can be used with policies as well)
- Topic filters
- SQL filters (sys.* exposes the system properties set by Azure)
- boolean filters (SQL filters that always evaluate to true or false)
- correlation filters - match on properties such as content type or correlation id
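A SQL filter can be attached to a subscription as a rule from the CLI; the names below are hypothetical:

```shell
# Only messages whose Label system property is 'high' reach this subscription
az servicebus topic subscription rule create \
  --resource-group my-rg \
  --namespace-name my-sb-ns \
  --topic-name orders \
  --subscription-name high-priority \
  --name priority-filter \
  --filter-sql-expression "sys.Label = 'high'"
```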
- Queue
- CLI
- az servicebus namespace create --name --resource-group --location --sku
- az servicebus queue create --resource-group --namespace-name --name --max-size 1024
- az servicebus topic create --resource-group --namespace-name --name --max-size 1024
- az servicebus topic subscription create --resource-group --namespace-name --topic-name --name
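The commands above, filled in with hypothetical values, look like this end to end:

```shell
# Namespace first, then entities inside it (names are examples only)
az servicebus namespace create --name my-sb-ns --resource-group my-rg --location westeurope --sku Standard
az servicebus queue create --resource-group my-rg --namespace-name my-sb-ns --name orders --max-size 1024
az servicebus topic create --resource-group my-rg --namespace-name my-sb-ns --name events --max-size 1024
az servicebus topic subscription create --resource-group my-rg --namespace-name my-sb-ns --topic-name events --name all-events
```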
Event Grid
Event Grid is a central Azure service that reacts to changes in other resources.
- Maximum event size is 1 MB
- a storage account can be used as a source of events
- in the portal UI, the Events section allows subscribing to events generated inside the resource
- for a storage account those events include: blob created, blob deleted, etc.
- event contains information about itself
- event grid is the central place that those events are sent
- azure function can subscribe to the event grid and listen to events
- To debug locally, Event Grid needs to reach the local endpoint from outside, e.g. via ngrok
Event grid schema
- an event contains only the data describing what happened
- Event Grid offers a standard event grid schema
- events can be up to 1 MB
- Storage queue handler
- Send events to a queue
- Events filters
- Resource groups events
- fires events from resource group operations (resource created, deleted, etc.)
- supports advanced filters as well
- Event grid supports http trigger
- requires handshake first
- validation code
- validation url
- to handle the handshake, Event Grid sends an event of type Microsoft.EventGrid.SubscriptionValidationEvent (data class SubscriptionValidationEventData)
- parse this event to get the validation code and validation url
- create a SubscriptionValidationResponse with the validation code
- send back the response
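The handshake steps above can be sketched in C# with the Azure.Messaging.EventGrid package; `requestBody` is assumed to be the raw JSON posted by Event Grid to an HTTP endpoint:

```csharp
using Azure.Messaging.EventGrid;
using Azure.Messaging.EventGrid.SystemEvents;

// requestBody: the raw JSON body posted by Event Grid (assumed read from the HTTP request)
EventGridEvent[] events = EventGridEvent.ParseMany(BinaryData.FromString(requestBody));

foreach (EventGridEvent egEvent in events)
{
    if (egEvent.TryGetSystemEventData(out object systemEvent) &&
        systemEvent is SubscriptionValidationEventData validation)
    {
        // Echo the validation code back to complete the handshake
        var response = new SubscriptionValidationResponse
        {
            ValidationResponse = validation.ValidationCode
        };
        // return this object serialized as the HTTP 200 response body
    }
}
```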
- Custom topics
- Event grid supports custom topics creation
- the first step is to create an Event Grid topic
- package for this one is Azure.Messaging.EventGrid
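Publishing to a custom topic can be sketched as below; the endpoint and key are placeholders taken from the topic's access keys, and the subject/event type are free-form examples:

```csharp
using Azure;
using Azure.Messaging.EventGrid;

// Placeholders from the custom topic's access keys blade
var client = new EventGridPublisherClient(
    new Uri("https://<my-topic>.<region>.eventgrid.azure.net/api/events"),
    new AzureKeyCredential("<topic-key>"));

// Subject, event type and data version are free-form for custom topics
await client.SendEventAsync(new EventGridEvent(
    subject: "orders/1234",
    eventType: "MyApp.OrderCreated",
    dataVersion: "1.0",
    data: new { OrderId = 1234 }));
```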
Azure Event Grid vs Function trigger
- Both can react to the same events (e.g. blob created)
- Event Grid is preferred for high throughput (more than 100k blobs in the storage account or more than 100 blob updates per second)
Event hub
Event hub is a big data streaming platform (For telemetry data).
- message size 256 KB for basic and 1 MB for standard
- stream log data
- telemetry data
- event hub receives data via http, kafka or amqp protocol
- data is partitioned
- consumer group
- throughput
- event receivers
The Azure CLI provides the az eventhubs eventhub command group; it can, for example, fetch the event hub connection string as described in the Microsoft documentation.
- Creating event hub
- namespace -> location -> once the namespace is ready, the event hub can be created
- pricing ?
- after creation, the partition number can’t be changed
- Interacting with
- package Azure.Messaging.EventHubs
- connect via namespace and policies (namespace policies are applied to all things inside the namespace)
- sending events to event hub
- EventHubProducerClient is used to connect to event hub
- EventDataBatch is returned from CreateBatchAsync
- TryAdd is used to add the EventData to the batch
- EventHubProducerClient.SendAsync is used to send the batch
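The sending steps above can be sketched as follows; connection string and hub name are placeholders:

```csharp
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Producer;

await using var producer = new EventHubProducerClient("<connection-string>", "myhub");

// CreateBatchAsync returns an EventDataBatch sized to the hub's limits
using EventDataBatch batch = await producer.CreateBatchAsync();

// TryAdd returns false when the batch is full or the event is too large
if (!batch.TryAdd(new EventData(BinaryData.FromString("telemetry-1"))))
    throw new InvalidOperationException("Event does not fit in the batch");

await producer.SendAsync(batch);
```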
- consuming events from event hub
- the receiver needs to be in a ConsumerGroup - by default Azure creates one ($Default)
- EventHubConsumerClient is used to connect to event hub as a client
- ReadEventsAsync to return a PartitionEvent
- With PartitionEvent there is a property called Data
- EventBody arrives as bytes and needs to be converted (e.g. with ToString()) in C#
- for better throughput use one consumer per partition - recommended by Microsoft
- up to 5 concurrent readers are allowed per partition
- From EventHubConsumerClient fetch the partition ids with the method GetPartitionIdsAsync
- With the id list uses the method ReadEventsFromPartitionAsync to fetch events from a given partition
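The per-partition consuming flow above can be sketched like this; connection string and hub name are placeholders, and only the first partition is read:

```csharp
using System;
using Azure.Messaging.EventHubs.Consumer;

await using var consumer = new EventHubConsumerClient(
    EventHubConsumerClient.DefaultConsumerGroupName, // "$Default"
    "<connection-string>", "myhub");

// One consumer per partition is the recommended pattern for throughput
string[] partitionIds = await consumer.GetPartitionIdsAsync();

await foreach (PartitionEvent pe in consumer.ReadEventsFromPartitionAsync(
    partitionIds[0], EventPosition.Earliest))
{
    // EventBody is raw bytes; convert it to a string
    Console.WriteLine(pe.Data.EventBody.ToString());
}
```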
- package Azure.Messaging.EventHubs.Processor -> live listener for changes
- it requires a storage account
- create a policy to listen
- it uses BlobContainerClient to connect to the storage
- EventProcessorClient listens for changes
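The processor setup described above can be sketched as below; the storage and event hub connection details are placeholders:

```csharp
using System;
using System.Threading.Tasks;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Consumer;
using Azure.Storage.Blobs;

// Checkpoints are persisted in a blob container (placeholders)
var container = new BlobContainerClient("<storage-connection-string>", "checkpoints");

var processor = new EventProcessorClient(
    container,
    EventHubConsumerClient.DefaultConsumerGroupName,
    "<eventhub-connection-string>", "myhub");

processor.ProcessEventAsync += async args =>
{
    Console.WriteLine(args.Data.EventBody.ToString());
    await args.UpdateCheckpointAsync(); // persist progress to the blob container
};

processor.ProcessErrorAsync += args =>
{
    Console.WriteLine(args.Exception.Message);
    return Task.CompletedTask;
};

await processor.StartProcessingAsync();
```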
- Captures
- allows streaming data from event hub to blob storage or Azure Data Lake
- stored in .avro format
- Streaming azure sql database logs
- in the sql database
- diagnostics setting
- the destination can be set to event hub
- subscriptions
- namespace
- name
- policy name
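The same diagnostic-settings routing can be configured from the CLI; all names and resource ids below are hypothetical:

```shell
# Route the database's diagnostic logs to an event hub
az monitor diagnostic-settings create \
  --name sql-to-eventhub \
  --resource "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Sql/servers/my-server/databases/my-db" \
  --event-hub my-hub \
  --event-hub-rule "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.EventHub/namespaces/my-eh-ns/authorizationRules/RootManageSharedAccessKey" \
  --logs '[{"category":"SQLInsights","enabled":true}]'
```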
API management
Policies
- API management transformation policies
- applied before the request is forwarded to the backend (inbound)
- or after the response comes back (outbound)
- policies are written in XML
- IP restrictions
- policies editor
- under inbound rule
- ip-filter tag
- with a tag address inside
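An inbound IP restriction policy looks like the fragment below; the addresses are example values:

```xml
<inbound>
    <base />
    <!-- Allow only these callers; use action="forbid" to block instead -->
    <ip-filter action="allow">
        <address>203.0.113.10</address>
        <address-range from="203.0.113.0" to="203.0.113.255" />
    </ip-filter>
</inbound>
```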
- API management cache
- caches responses to requests made to the API
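Response caching is configured with a cache-lookup policy inbound and a cache-store policy outbound; the duration below is an example:

```xml
<policies>
    <inbound>
        <base />
        <!-- Serve from cache when a matching entry exists -->
        <cache-lookup vary-by-developer="false" vary-by-developer-groups="false" />
    </inbound>
    <outbound>
        <base />
        <!-- Cache the backend response for one hour -->
        <cache-store duration="3600" />
    </outbound>
</policies>
```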