AZ-204 Developer Associate: Navigating Azure Service Connectivity and Consumption

Last updated Jul 11, 2023 Published Jul 1, 2022

The content here is licensed under the Attribution 4.0 International (CC BY 4.0) license

Consuming Azure services and connecting to them account for 15%–20% of the exam. Based on the mock exams, most of the questions relate to storage queues, service bus, event grid, and API management (policies). This section goes over each service that could potentially appear in a real az-204 exam.

Service bus

For decoupling applications

  • basic
    • queues only / no topics
    • maximum message size of 256kb
  • standard
    • queues
    • topics
    • maximum message size of 256kb
  • premium
    • queues and topics
    • maximum message size of 1mb
  • Create queue
    • Peek
    • Receive
    • it is possible to specify the message content type (by default it uses xml)
  • Policies (shared access policies)
    • can be used to allow read (Listen), write (Send), or Manage
  • package to integrate with c#: Azure.Messaging.ServiceBus
    • Queue
      • ServiceBusClient to connect to service bus
      • ServiceBusReceiver to peek or receive messages from the queue
      • Peek and Lock
        • uses CompleteMessageAsync to settle (delete) the message
      • Receive and Delete
        • the message is removed as soon as it is received
      • DeadLetterQueue stores expired messages and messages that failed processing
      • to access the dead-letter queue, append /$DeadLetterQueue to the queue name; it can then be read like a regular queue
    • Duplicate detection
      • must be enabled at queue creation
      • prevents duplicates (by message id) within a time window
    • Topics
      • are based on subscriptions (and can be used with policies as well)
    • Topic filters
      • sql filters (sys.* accesses system properties generated by azure)
      • boolean filters (a special case of the sql filter that matches all or no messages)
      • correlation filters - match on system properties such as content type or correlation id
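The queue flow above can be sketched with the Azure.Messaging.ServiceBus package; the connection string and queue name below are placeholders:

```csharp
using Azure.Messaging.ServiceBus;

// Placeholders - use your own namespace connection string and queue name.
const string connectionString = "<service-bus-connection-string>";
const string queueName = "<queue-name>";

await using var client = new ServiceBusClient(connectionString);

// Send a message, optionally overriding the content type.
ServiceBusSender sender = client.CreateSender(queueName);
await sender.SendMessageAsync(new ServiceBusMessage("hello")
{
    ContentType = "application/json"
});

// Receive in the default PeekLock mode: the message stays locked
// until CompleteMessageAsync settles (deletes) it.
ServiceBusReceiver receiver = client.CreateReceiver(queueName);
ServiceBusReceivedMessage message = await receiver.ReceiveMessageAsync();
await receiver.CompleteMessageAsync(message);

// The dead-letter queue is a sub-queue and can be read like a regular queue.
ServiceBusReceiver dlqReceiver = client.CreateReceiver(queueName,
    new ServiceBusReceiverOptions { SubQueue = SubQueue.DeadLetter });
```

If ReceiveAndDelete mode is used instead (via ServiceBusReceiverOptions.ReceiveMode), the CompleteMessageAsync call is unnecessary because the message is removed on receipt.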


  • az servicebus namespace create --name --resource-group --location --sku
  • az servicebus queue create --resource-group --namespace-name --name --max-size 1024
  • az servicebus topic create --resource-group --namespace-name --name --max-size 1024
  • az servicebus topic subscription create --resource-group --namespace-name --topic-name --name


To increase throughput when the number of senders is high but the number of consumers is low, the recommended approach is to enable partitioning on the queue.

Queue with delay

Service bus also offers scheduled delivery: a message can be enqueued with a delay so it only becomes visible to receivers after a given time.
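A minimal sketch of that delayed delivery using ScheduleMessageAsync from Azure.Messaging.ServiceBus (connection string, queue name, and delay are placeholders):

```csharp
using Azure.Messaging.ServiceBus;

await using var client = new ServiceBusClient("<service-bus-connection-string>");
ServiceBusSender sender = client.CreateSender("<queue-name>");

// The message is accepted immediately but only becomes visible
// to receivers once the scheduled enqueue time is reached.
long sequenceNumber = await sender.ScheduleMessageAsync(
    new ServiceBusMessage("delayed hello"),
    DateTimeOffset.UtcNow.AddMinutes(5));

// The returned sequence number can be used to cancel the schedule.
await sender.CancelScheduledMessageAsync(sequenceNumber);
```

Setting ScheduledEnqueueTime on the ServiceBusMessage itself before a normal SendMessageAsync has the same effect.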


Event Grid

Event grid is a central service running on azure that can react to changes in different resources.

  • Maximum size of an event is 1mb (json)
  • storage account is used as a source of events
    • through the ui there is the section Events to subscribe to events generated inside the resource
    • for a storage account those events could be: blob created, blob deleted, etc.
  • each event contains information about itself (subject, event type, time, data)
  • event grid is the central place those events are sent to
  • an azure function can subscribe to the event grid and listen to events
  • to debug locally, event grid needs access to the local development environment from outside, e.g. via ngrok

Event grid schema

  • Storage queue handler
    • Send events to a queue
    • Events filters
  • Resource groups events
    • Fire events from resource groups (Creating, deleting etc)
    • supports advanced filters as well
  • Event grid supports http trigger
    • requires handshake first
    • validation code
    • validation url
    • to handle the handshake, event grid sends a validation event whose data is of type SubscriptionValidationEventData
      • parse this event to get the validation code and validation url
      • create a SubscriptionValidationResponse with the validation code
      • send back the response
  • Custom topics
    • Event grid supports custom topics creation
    • first step is to create an event grid topic
    • package for this one is Azure.Messaging.EventGrid
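The validation handshake described above can be sketched with the Azure.Messaging.EventGrid package; here requestBody stands in for the raw JSON delivered to the HTTP endpoint:

```csharp
using Azure.Messaging.EventGrid;
using Azure.Messaging.EventGrid.SystemEvents;

// requestBody is the raw JSON payload received by the HTTP trigger.
string requestBody = /* read from the incoming HTTP request */ "";

EventGridEvent[] events = EventGridEvent.ParseMany(BinaryData.FromString(requestBody));

foreach (EventGridEvent egEvent in events)
{
    // The handshake arrives as a system event carrying the validation code.
    if (egEvent.TryGetSystemEventData(out object systemEvent) &&
        systemEvent is SubscriptionValidationEventData validation)
    {
        // Echo the code back; event grid completes the subscription on success.
        var response = new SubscriptionValidationResponse
        {
            ValidationResponse = validation.ValidationCode
        };
        // return an HTTP 200 with `response` serialized as json
    }
}
```

Alternatively, the validation url from the event can be visited manually to confirm the subscription.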

Azure Event Grid event types

  • EventTypes
    • Filter failure or success events for any resource deployed to an azure subscription
  • Subject begins or ends with
    • Filter events whenever objects are added to a specific container in azure blob storage
  • Advanced fields and operators
    • Filter messages by values in the data field and specify the comparison operator

Azure Event Grid vs Function trigger

  • Both can be used to send events
  • Event grid is preferred for high throughput (more than 100k blobs in the storage account or 100 blob updates per second)

Event grid operation name


Event hub

Event hub is a big data streaming platform (for telemetry data) and general purpose event ingestion service. It has the following properties:

  • message size 256kb for basic and 1mb for standard
  • stream log data
  • telemetry data
  • event hub receives data via http, kafka or amqp protocol
  • data is partitioned
  • consumer group
  • throughput
  • event receivers

Creating event hub

  • namespace -> location -> once namespace is ready then event hub creation is allowed
    • pricing ?
    • after creation, the partition number can’t be changed
  • Interacting with
    • package Azure.Messaging.EventHubs
    • connect via namespace and policies (namespace policies are applied to all things inside the namespace)
    • sending events to event hub
      • EventHubProducerClient is used to connect to event hub
      • EventDataBatch is returned from CreateBatchAsync
      • TryAdd is used to add the EventData to the batch
      • EventHubProducerClient.SendAsync is used to send the batch
    • consuming events from event hub
      • EventHubConsumerClient is used to connect to event hub as a client
      • a receiver needs to be in a consumer group - by default azure creates one ($Default)
      • ReadEventsAsync returns PartitionEvents
      • each PartitionEvent has a property called Data
      • the EventBody is raw bytes, so it needs to be converted in c#
      • for better throughput use one consumer per partition - recommended by microsoft
        • up to 5 concurrent readers are allowed per partition
        • From EventHubConsumerClient fetch the partition ids with the method GetPartitionIdsAsync
        • With an id from that list, use the method ReadEventsFromPartitionAsync to fetch events from a given partition
      • package Azure.Messaging.EventHubs.Processor -> live listener for changes
        • it requires a storage account (to checkpoint progress)
        • create a policy to listen
        • It uses BlobContainerClient to connect to the storage
        • EventProcessorClient listens to changes
    • Capture - automatically persists the streamed events to a storage account
    • Streaming azure sql database logs
      • in the sql database
      • diagnostics setting
      • destination can be set to event hub
        • subscriptions
        • namespace
        • name
        • policy name
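The send/consume flow above can be sketched with the Azure.Messaging.EventHubs package; the connection string and hub name are placeholders:

```csharp
using System;
using System.Threading;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Consumer;
using Azure.Messaging.EventHubs.Producer;

const string connectionString = "<event-hubs-namespace-connection-string>";
const string hubName = "<event-hub-name>";

// Producer: batch events and send them in one call.
await using (var producer = new EventHubProducerClient(connectionString, hubName))
{
    using EventDataBatch batch = await producer.CreateBatchAsync();
    // TryAdd returns false when the batch is full.
    batch.TryAdd(new EventData(BinaryData.FromString("telemetry-1")));
    batch.TryAdd(new EventData(BinaryData.FromString("telemetry-2")));
    await producer.SendAsync(batch);
}

// Consumer: read from all partitions via the default consumer group ($Default).
await using var consumer = new EventHubConsumerClient(
    EventHubConsumerClient.DefaultConsumerGroupName, connectionString, hubName);

using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));
await foreach (PartitionEvent partitionEvent in consumer.ReadEventsAsync(cts.Token))
{
    // EventBody is raw bytes; convert it to a string here.
    string body = partitionEvent.Data.EventBody.ToString();
}
```

For per-partition consumption, GetPartitionIdsAsync plus ReadEventsFromPartitionAsync would replace the ReadEventsAsync loop.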

Event hub CLI

Azure CLI command to fetch the connection string from an event hub:

az eventhubs eventhub authorization-rule keys list --resource-group x --namespace-name ns --eventhub-name y --name RootManageSharedAccessKey

The command line has a trick in it that might confuse the arguments and how to use them: event hub's top-level CLI group is eventhubs (plural), and I would often miss the s at the end.

Azure CLI has the command az eventhubs eventhub; fetching the connection string from an event hub, for example, uses the authorization-rule keys list subcommand, as described in the microsoft documentation.

API management


  • Check HTTP header - Enforces existence and/or value of an HTTP Header.
  • Get authorization context - Gets the authorization context of a specified authorization configured in the API Management instance.
  • Limit call rate by subscription - Prevents API usage spikes by limiting call rate, on a per subscription basis.
  • Limit call rate by key - Prevents API usage spikes by limiting call rate, on a per key basis.
  • Restrict caller IPs - Filters (allows/denies) calls from specific IP addresses and/or address ranges.
  • Set usage quota by subscription - Allows you to enforce a renewable or lifetime call volume and/or bandwidth quota, on a per subscription basis.
  • Set usage quota by key - Allows you to enforce a renewable or lifetime call volume and/or bandwidth quota, on a per key basis.
  • Validate JWT - Enforces existence and validity of a JWT extracted from either a specified HTTP Header or a specified query parameter.
  • Validate client certificate - Enforces that a certificate presented by a client to an API Management instance matches specified validation rules and claims.

  • API management transformation policies
    • are applied before the request is forwarded to the backend (inbound)
    • or on the way back (outbound)
    • policies are written in xml
  • IP restrictions
    • policies editor
    • under the inbound section
      • ip-filter tag
      • with an address tag inside
  • API management cache
    • caches the responses to requests made to the api
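A minimal sketch of an inbound ip-filter policy as it would appear in the policies editor (the addresses are placeholders):

```xml
<inbound>
    <base />
    <!-- Only calls from the listed address or range are allowed. -->
    <ip-filter action="allow">
        <address>203.0.113.10</address>
        <address-range from="203.0.113.20" to="203.0.113.50" />
    </ip-filter>
</inbound>
```

Switching action to "forbid" denies the listed callers instead.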