
MB 600 – D365UG Study Group


If, like me, you are preparing for the Dynamics 365 & Power Platform Solution Architect exam, you might want to join this MB 600 study group that is being hosted by D365UG.

Almost nobody “enjoys” preparing for exams! But learning is critical to our working life so anything that might help you prepare for your MB 600 exam might be a bonus.

You can join our panel as they cover a fresh topic each week. Plus you can ask questions and gain support as needed. We’ve run these study groups before and I believe many people have found them beneficial.

Essentially we will use my MB 600 revision guide to help support each session. You can find my revision guide here. (But we will go off on tangents and maybe dive deeper into some topics as requested.)

Our team of moderators are all seasoned Power Platform experts and very active in the Dynamics 365 community. I think at least three of our panel have received the prestigious annual D365UG “All Star” award for contributions to the Dynamics 365 community. Plus several are Microsoft MVPs and Microsoft Certified Trainers (MCT). This “dream team” should be very well placed to collectively answer any question which might arise. Our little band includes;

  • Beth Burrell (MCT),
  • Heidi Neuhauser,
  • Todd Mercer,
  • Kylie Kiser (MVP),
  • Peter Gulka,
  • Nick Doelman (MCT & MVP),
  • Malcom McAuley,
  • And also me …. Neil Parkhurst (MCT & MVP)

These webinars are free to attend; all you need is a D365UG profile.

The webinars will be recorded. But in order to access the recorded content, you will need a D365UG membership.

Week one was last week! So you might have missed that one … but you can still register for as many of the future sessions as possible. It really doesn’t matter if you can’t make every week. Just join as few or as many of the webinars as you can.

Hopefully you can join us for at least some of these study sessions. And I wish you the best of luck with your exam and look forward to hearing you’ve passed ….


MB 600: Microsoft Dynamics 365 + Power Platform Solution Architect – Lead design process (Part Two)


I am currently preparing to take the Microsoft Dynamics 365 + Power Platform Solution Architect certification. Or “MB 600” for short! As I prepare I plan to write revision notes; in this post I will cover leading the design process.

A Dynamics 365 / Power Platform Solution architect needs to lead successful implementations, meaning the architect is a key member of the project team. They must demonstrate functional and technical knowledge of Power Platform and Dynamics 365 apps. And beyond that they must appreciate other related technologies that form part of the overall architecture.

This section of the exam is pretty large! I therefore plan to split this information into three posts. This being the second post of three.

In this post I will try to tackle topics under the following headings;

  • Identify opportunities for component reuse
  • Communicate system design visually
  • Design Application Lifecycle Management (ALM)

Identify opportunities for component reuse

App components allow reuse within and across apps. For example, an account form created for your sales app may work perfectly well in your marketing and customer service apps. Whilst you could create a separate form for each app, if it works reuse it!

Additionally, careful consideration of which components will be included in your apps can allow multiple people to work on building a single app.

Having common components can help promote consistency in the application. And also reduce redundancy.

Plus component reuse results in a final solution that should be easier to maintain.

Custom components (PCF) can be created to deliver reusable features. Look for visuals that would benefit from the investment of making them a component! (e.g. headers, common widgets etc.)

Canvas app components are targeted at canvas app makers and can only be used in canvas apps. Professional developers can build components using Power Apps Component Framework (PCF). These components can be re-used across your model-driven apps and potentially also canvas apps.
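To make the shape of a PCF component more concrete, below is a minimal sketch in TypeScript of the lifecycle a code component implements. (The class name and the “titleText” property are purely illustrative assumptions; a real project would typically be scaffolded with the Power Platform CLI and declare its properties in the control manifest.)

```typescript
import { IInputs, IOutputs } from "./generated/ManifestTypes";

// Illustrative only: a trivial control that renders a heading from a bound property.
export class HeaderBanner implements ComponentFramework.StandardControl<IInputs, IOutputs> {
  private container: HTMLDivElement;

  // Called once when the control is initialised.
  public init(
    context: ComponentFramework.Context<IInputs>,
    notifyOutputChanged: () => void,
    state: ComponentFramework.Dictionary,
    container: HTMLDivElement
  ): void {
    this.container = container;
  }

  // Called whenever a bound property or the viewport changes.
  public updateView(context: ComponentFramework.Context<IInputs>): void {
    // "titleText" is an assumed property defined in the control manifest.
    const title = (context.parameters as any).titleText?.raw ?? "";
    this.container.textContent = title;
  }

  // Return values for any bound output properties (none in this sketch).
  public getOutputs(): IOutputs {
    return {};
  }

  // Clean up (remove event listeners etc.) when the control is removed.
  public destroy(): void {
    this.container.textContent = "";
  }
}
```

The point to note is that the same compiled component can be packaged in a solution and reused across forms and apps, which is exactly the reuse opportunity described above.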

As the Solution Architect designs classic workflows or Power Automate flows, considering when to use child flows may be required. Child flows allow us to break out parts of the flow into reusable child flows. If using a child flow you may need to ensure the “current environment” connector is used.

Communicate system design visually

The Architect will often create diagrams to help identify at a high-level how the requirements will be implemented. This can act as a useful guide to help the implementation team create a detailed design. Therefore using diagrams can help communicate the solution design and data architecture in an easy to understand manner to both the customer and wider development team.

Additionally the creation of an overall solution architecture diagram will help identify any opportunities to create proofs of concept / prototypes. When creating a POC you will typically not be creating the entire system, so having a visualization to see how this fits into the wider system may be very useful.

It will also be common to create diagrams to illustrate the data model, often called Entity Relationship Diagrams (ERDs). These show the entities and how they relate to other entities in the same data model.


Design Application Lifecycle Management (ALM)

Multiple people will be working on the project at the same time and often will be based in different locations. (For example, some of the developers could be based offshore.)

An agreed approach to deployments and testing environments will be required. Do you need separate environments for development, system testing, user acceptance testing, integration testing etc? In a simple scenario the customer may have all their environments within a single tenant. But it is also common for multiple tenants to exist. Therefore deployments need to be from instance to instance which may or may not be located in the same tenant.

Out of the box the Power Platform does not provide version-controlled tracking. ALM processes will be needed to ensure it is clear what components have been deployed to which instances and in what state.

When we talk about environments it may be useful to consider what is (and isn’t) in an environment. Our environment will contain all the Power Apps, Power Automates and Power Virtual Agents that make up our Power Platform solution. Additionally the environment will contain the common data service and any custom connectors. External to our environment (but maybe connecting to it) will be Power BI, any Azure services and non-customer engagement Dynamics 365 apps. (Such as Finance).

We may also need to consider the location of our environments / data. Microsoft publish details of their business applications and potential locations here.

There are multiple factors that might influence the choice of data location. Including compliance / residency requirements. But also technical constraints such as latency will come into play. Additionally you may find certain applications or features are only available in certain locations. But commonly selecting a location that is close to the majority of users will be preferable. You should also be aware that the location of the tenant (for billing) can be different to the location of the environment and therefore the data location.

Often multiple environments will exist from an ALM point of view. With separate instances for dev, test, prod etc. But other reasons exist such as when you need to isolate data that can’t be co-located. Or if you have conflicting customizations that can’t co-exist. These sorts of scenarios can be common in large organisations with regional business models that differ in approach or maybe have differing compliance requirements.

As a Power Platform Solution Architect we will be expected to lead the establishment of an ALM plan. We may be called upon to evaluate / determine the level of sophistication in our ALM processes that are appropriate for the project. This will no doubt involve working with various teams to support their efforts to implement the selected ALM tools / processes. The Architect may need to consider;

  • Environments – how many and what is their purpose?
  • Source code control – where will the master copy of the solutions and code reside?
  • DevOps – what is the workflow for developers and how / who will promote the app from dev to production?
  • Deployment configuration – how to configure each environment and what automations can be used to make this process easier?

Traditionally the Power Platform has been environment centric. Meaning changes have been completed in a master environment and promoted from that environment into test, prod etc. An alternative approach is to be source control centric, meaning the source control system becomes the master. Dev can be re-created from source control, probably via an automated / repeatable process. Changes from dev are therefore checked into source control. Microsoft is encouraging and building tools to support a source control centric ALM approach.

Solutions

Solutions are the containers that are used to track changes made to the common data service, Power Apps and Power Automate flows. Solutions are used to transport and install changes to target environments. The Microsoft Dynamics 365 apps are deployed into the Power Platform as solutions. Additionally 3rd party apps created by independent software vendors (ISVs) are also delivered as solutions. Plus internally you will create your own set of solutions as required.

Solutions may be unmanaged or managed. We typically use unmanaged solutions for development purposes and transport them to test / production environments as managed solutions.

As part of your revision I suggest you ensure you are familiar with the differences between managed and unmanaged solutions. And additionally how solution layering operates to provide the final “solution” visible to the end user. I describe solutions in greater detail in this post.

In any project we may decide to use multiple solutions. But it is advised that you only do this if there is a tangible purpose, as multiple solutions can bring complexity. Not least as dependencies can be created between solutions. Ideally solutions should remain independent and you may wish to create separate environments to ensure this is achieved. When using multiple solutions we could opt for a horizontal or vertical split of components. With a horizontal split we’d have solutions for groups of component types. Say a solution for visual components, another for processes & plug-ins, another for security roles etc. With a vertical split on solutions we may group them into functional areas. Maybe with one shared common / base solution and then separate solutions for each key business area.

When we consider which sub components to add into a solution it is considered good practice to only include the required components. Avoid including all the sub components of an entity or all its metadata unless you are making changes to all the components. Including only the changed components will help make deployment of solutions more manageable and will help reduce the chance of unnecessary dependencies.

Solution aware code assets should be built within a build environment. And not on the developer’s desktop! Code assets will include plug-ins, form script and Power Apps component framework components. After the build they should be deployed to the master environment and will then be exported into the master solution.

Non-solution components

The ALM plan must also consider how to manage any components that will sit outside the Power Platform environment but may still be “environment aware”. These could include Power BI visualizations, Azure deployed components and external integration services.

We may also need to migrate a common set of configuration data from environment to environment. The Configuration Migration Tool can help move data between environments. Importantly it can maintain a common primary record identifier (GUID) for this data. One good example of this which I have commonly experienced is Unified Service Desk (USD). Our USD config is simply data used to describe how the user interface should behave. The USD config can be moved using the configuration migration tool.

Environment variables may also be used and tracked as solution components. The environment variables can be of types decimal, JSON, text and two options. Each can have a default value and a current environment value. Apps, Power Automate and developer code can retrieve and modify the values of environment variables.
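As an illustration of how code might read an environment variable, here is a hedged sketch using the Dataverse Web API. (The schema name “new_ApiBaseUrl” is hypothetical, and the relationship name used in the $expand is my assumption of the standard definition-to-value relationship, so verify it against your environment before relying on it.)

```typescript
// Minimal sketch: read an environment variable's default and current values via the Web API.
// Assumes orgUrl (e.g. https://yourorg.crm.dynamics.com) and an access token are already available.
async function getEnvironmentVariable(orgUrl: string, token: string, schemaName: string) {
  const query =
    `${orgUrl}/api/data/v9.2/environmentvariabledefinitions` +
    `?$select=schemaname,defaultvalue` +
    // Relationship name below is an assumption - check your environment's metadata.
    `&$expand=environmentvariabledefinition_environmentvariablevalue($select=value)` +
    `&$filter=schemaname eq '${schemaName}'`;

  const response = await fetch(query, {
    headers: {
      Authorization: `Bearer ${token}`,
      Accept: "application/json",
      "OData-MaxVersion": "4.0",
      "OData-Version": "4.0",
    },
  });

  const data = await response.json();
  const definition = data.value[0];
  const currentValue =
    definition?.environmentvariabledefinition_environmentvariablevalue?.[0]?.value;

  // Fall back to the default value when no environment specific value has been set.
  return currentValue ?? definition?.defaultvalue;
}

// Hypothetical usage:
// const apiBaseUrl = await getEnvironmentVariable(orgUrl, token, "new_ApiBaseUrl");
```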

DevOps

Azure DevOps (formerly known as Visual Studio Team Services) provides development and collaboration tools. Including Azure Boards, Pipelines, Repos, Test plans and artefacts.

  • Azure Boards – plan, track, and discuss work across your teams
  • Azure Pipelines – use to automate CI/CD builds and releases
    • Build pipelines are used to create a dev environment, commit changes from dev to source control, check solutions and automate testing
    • Release pipelines are used to migrate solutions from build pipelines into test/prod.
  • Azure Repos – source control to store and track changes
  • Azure Test Plans – plan, execute, and track scripted tests
  • Azure Artefacts – publish solutions built by build pipelines

Azure DevOps is not the only tool available to us! CDS / admin APIs or direct PowerShell could be used instead for build tasks. Or Power Automate can be used with platform admin connectors to automate deployment tasks.

MB 600: Microsoft Dynamics 365 + Power Platform Solution Architect – Lead design process (Part Three)


I am currently preparing to take the Microsoft Dynamics 365 + Power Platform Solution Architect certification. Or “MB 600” for short! As I prepare I plan to write revision notes; in this post I will cover leading the design process.

A Dynamics 365 / Power Platform Solution architect needs to lead successful implementations, meaning the architect is a key member of the project team. They must demonstrate functional and technical knowledge of Power Platform and Dynamics 365 apps. And beyond that they must appreciate other related technologies that form part of the overall architecture.

This section of the exam is pretty large! I therefore plan to split this information into three posts. This being the third post of three.

In this post I will try to tackle topics under the following headings;

  • Design data migration strategy
  • Partition features into apps
  • Design visualization strategy
  • Design for upgradeability
  • Create functional design documents

Design data migration strategy

The Solution Architect should lead the data model design and alongside that should consider what data should be migrated into the new system.

Data may need to be migrated into Dynamics 365. This may be a one-off task, as data from older systems being replaced may need to be imported at the start of the project. Or the migration could be an ongoing task, maybe because you need to synchronise key data attributes between systems.

Tip:
When considering a potential “ongoing” data migration / integration it may be useful to consider if a virtual entity could be used rather than duplicating the data!

Data quality – if you plan to import data from a legacy system you may need to consider data quality. And if any existing data issues need to be handled. For example, are the addresses held in the old system adequate? Are there missing mandatory fields? Are phone numbers incorrectly formatted?

Duplicates – duplication of data can be a significant issue with “CRM” systems. Do you have large numbers of duplicate leads or contact records? And if so, should these duplicates be resolved prior to migration?

Data retention– Do you really need to import all of the data from the legacy application? Especially as storage is not free! But there might be requirements to keep certain entities for a given period.

Data Transformation – the data in the legacy system may be held in a different format to that required in the new solution. Therefore the process of migration may not be a simple import. Transformation logic may be needed as part of the import.

Tools – what tools will be used to migrate the data? The Power Platform does include out of the box features to import data. But with large complex migrations you may need to consider using 3rd party tools such as KingswaySoft or Scribe.
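To make the transformation point above a little more concrete, here is a deliberately simple sketch of the kind of mapping / cleansing logic a migration might need. (The legacy column names are invented and the target columns are just illustrative contact fields; a real migration would normally do this inside an ETL tool rather than hand-written code.)

```typescript
// Hypothetical legacy extract row - the column names are illustrative only.
interface LegacyContactRow {
  FULL_NAME: string; // e.g. "Jane  Smith"
  PHONE_NO: string;  // e.g. "(01234) 567-890"
  CREATED: string;   // e.g. "31/12/2019" (dd/mm/yyyy)
}

// Map and clean a legacy row into the shape expected by the target contact table.
function transformLegacyContact(row: LegacyContactRow) {
  const [firstname, ...rest] = row.FULL_NAME.trim().split(/\s+/);
  const [day, month, year] = row.CREATED.split("/");

  return {
    firstname,
    lastname: rest.join(" ") || "(unknown)",
    // Strip formatting characters so phone numbers arrive in a consistent shape.
    telephone1: row.PHONE_NO.replace(/[^\d+]/g, ""),
    // Convert the legacy dd/mm/yyyy date into ISO 8601 ready for import.
    overriddencreatedon: `${year}-${month}-${day}`,
  };
}
```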

Partition features into apps

A Dynamics 365 or Power Platform solution is typically made up of one or more apps. Each app will deliver a logical group of functionality. (Be that via a Canvas App or a Model Driven App.)

The Architect should look for logical functional groupings to help partition the features into apps. This might be based on job functions or areas of the business such as “Customer Service”, “Sales”, “Marketing” etc.

Out of the box Dynamics 365 ships as a number of model-driven apps. When appropriate you can leverage those apps as shipped. For example, I have often implemented the Marketing app with little change. In other circumstances the out of the box app groupings may not fit your organization. Then consider creating a custom app which may include attributes of several out of the box apps. You can read about model-driven app design here.

One advantage of using an out of the box app is users would see all new updates as they become available. But equally the out of the box apps may include features you don’t need! Composing a custom (new) app does give you complete control over the options the users see but you will have to manually add new “things” as they become available.

The Solution Architect may need to know when to start with a model-driven app, when to implement a canvas app or when a portal may be more appropriate. So understanding the differences could be important! To help I have highlighted some features of each below;

Model-Driven Apps
  • CDS data driven
  • Backoffice / process focused
  • Responsive / consistent UI
  • User personalization
  • Consistent accessibility
  • User tooling (Excel etc.)
  • Customizing existing first-party apps
  • Data relationships drive navigation
  • Automatic security trimming of UI

Canvas Apps
  • Not CDS data driven (can leverage external data via connectors)
  • Task focused apps
  • Visual presentation of information
  • Custom UI
  • Device integration (e.g. camera usage)
  • Basic offline support
  • SharePoint or Teams embedding

Portals
  • CDS data driven
  • External user focused
  • Web application
  • Use model-driven forms and views as a framework to surface CDS data
  • Can be customized with standard web technologies (HTML, JavaScript, CSS etc.)

The decision of which approach to use in your solution may result in a hybrid approach. This isn’t a one size fits all scenario! For example, you may use a model-driven app for your admin functions, a canvas app for the user facing elements and a portal for external access. Therefore one solution may comprise any combination of app types.

Additionally don’t forget that you can embed Canvas apps into model-driven apps. This might allow the introduction of visuals and connectors into a Model-driven app based solution which might otherwise not be able to leverage these capabilities. But I advise you to remember that “pretty” apps are nice but performant apps receive better user adoption. So carefully considering the best approach for your solution is advised.

When deciding how many apps should be included in your solution or what type of app to use, there are a number of guidelines you may wish to consider. Including;

  • Large monolithic apps should be avoided
  • Too many small apps are jarring to the user if they have to switch context frequently
  • Components can be used by multiple apps, allowing composition of apps that target users with specific needs.
  • Offer groups of users targeted mobile apps to save time when away from their desk

Design visualization strategy

Creation of visualizations covering user screens, reports and other insights will be an essential task of the solution architect. This process should be ongoing. Right from pre-sales into project initiation and beyond into the analysis, design and implementation phases.

Often customers will focus on the user experience. Understanding if the users are always, sometimes or never mobile will influence the UI design.

Wireframes may be needed to show main forms, dashboards, mobile experiences and other visualizations such as Power BI.

We may need to look for opportunities to use proactive insights or AI in place of a more reactive use of traditional reports and analytics. (Although our designs will also often include traditional reporting mechanisms!)

There are many points to consider when designing visualizations, including;

  • Who will consume the reports? (Are they already Power Platform users?)
  • What data is required?
  • How fresh does the data have to be?
  • What data is required that might be external to our Power Platform / CDS solution?
  • Can existing reporting capabilities or “insights apps” be leveraged or is something custom required?
  • What actions do we expect users to need to take in response to information in reports?
    • Is the action something we can predict or automate?

Reporting could be grouped into three categories. Operations reports, self-service BI and Enterprise BI. Each of these could include the following …

  • Operational reports
    • views,
    • charts
    • dashboards,
    • SSRS reports
    • embedding of Power BI
    • Advanced find,
    • Excel / Word Templates
    • And maybe 3rd party reporting tools.
  • Self-service BI
    • manual exports of data into Excel,
    • Access to Power BI service (with data refreshed from CDS, maybe on a schedule).
  • Enterprise BI
    • the data export service into Azure SQL,
    • Web APIs for data extraction, transformation and loading (ETL),
    • Azure Data Lake Storage,
    • AI Builder.

Operational reports are typically easy to use, contain current data and can be accessed by most users with standard technical skills. However they will also commonly represent simple visualisations with only basic filtering and a limited access to historic data.

Self-service reports would typically be created by “power users” with more advanced reporting requirements. Often Power BI may be leveraged to allow them to discover, analyze and visualize data. When considering self-service reporting requirements the Solution Architect may need to ensure the required base transactional and reference data is available. Plus ensure any data security concerns are addressed. (For example, within Power BI CDS user security roles and hierarchy are not used.)

Enterprise BI may be used to reduce the load on operational data, by shifting any “heavy lifting” into Azure Data Lake. Typically enterprise BI may need access to potentially large volumes of historical data.

The AI Builder can provide pre-built models and intelligence you train. It can provide components which can be included directly in Canvas Apps and Power Automate. Or data can be created for use in Model-driven apps. The low-code AI capabilities include;

  • Prediction
    • Binary classification – predict and classify a field in CDS.
  • Vision
    • Forms processing – extract structured data from digital paper, PDFs and forms.
    • Object detection – detect objects through camera or image control. (may need training!)
  • Language
    • Text classification – classify, group and categorize any text in CDS

Additionally Microsoft Azure Cognitive services provide a family of AI services and cognitive APIs to help developers build custom intelligence into apps. You can read an overview about Azure Cognitive Services here.

Plus Solution Architects should be aware that Microsoft Azure Machine Learning can allow developers to implement enterprise grade AI scenarios not met by the AI Builder or Cognitive Services. You can read about Azure Machine Learning here.

Design for upgradeability

Maintaining custom code is more expensive than delivering features with simple low-code configuration of out of the box features. Additionally Dynamics 365 and the Power Platform are updated frequently. Therefore the Solution Architect should design solutions which are as easy to maintain as possible and created in such a way that the regular updates do not break the solution. Additionally the Architect should be responsible for ensuring the correct level of documentation is created so that future maintenance is easier.

Pushing the Power Platform beyond its “natural” capabilities and using unsupported customization techniques should be avoided. Using any unsupported approaches always increases your “technical debt” and will no doubt result in a system that is harder to upgrade.

Consistency across customizations is important. As this will aid the future maintainability of a solution. Additionally any custom software extensions should also follow common design principles.

Plus, consistency across the user experience should be maintained. Creating a consistent interface that sticks to the same layout and naming conventions will ultimately create an improved user experience and an application users are happier to use.

Also, consider that some attributes within the Power Platform are hard to change later. One example being the publisher prefix which gets applied to all schema names. We have all seen the odd “new_” name creep into our applications, this should be avoided! Consider carefully how to name entities and fields, as changing schema names later is “difficult”.

Create functional design documents

A functional design will describe how the requirements to be addressed by the solution will be delivered. It is often a formal document used to describe the intended capabilities, appearance and user interactions in detail.

Functional designs can take many forms. In fact most organisations I’ve worked with have had their own templates defining what should be included.

The functional design may be a single document but equally the design could be expressed as numerous user stories. A single functional requirements document (FRD) often has the advantage that it makes tracking the requirements really easy. If all the requirements are covered by one design document it should be easy to “tick off” that they have all been included. Multiple user stories however tend to aid planning, as each story can typically be delivered in one iteration or sprint. User stories however can lack the overall detail needed to help illustrate the bigger picture of what is being delivered.

Whatever template / approach you follow, the purpose of the functional specification is to capture what the software needs to do to support a business user. The functional specification could be considered the point at which the business meets IT! Often we’ll see the business users review the functional design to confirm their requirements have been met and also the developers will use the functional design documents as a key input into their technical design.

As with other aspects of the project life cycle, I would expect the Solution Architect to be closely involved with the creation of and review of the functional design. But that does not suggest that they will create the entire design. (Business analysts and other parties may also contribute to the functional design.)

Functional documents differ from technical specification / detailed design documents as they are based on the business requirements. They should contain details of end-user expectations and will therefore become a basis for the technical designs which will follow. I personally like to think of it like this …. The business requirements define what end result is needed, the functional design defines what will be delivered, the technical designs will define how it will be achieved.

There are potentially (but not always) multiple documents that could be considered as being part of the functional design. You may hear the acronyms BRD, SRS or FRD used …

  • Business requirements document (BRD) – this describes the needs we are trying to fulfil by developing this solution.
  • System requirements specification (SRS) – describes the functional requirements (and non-functional requirements) plus any use cases (stories) that the solution must fulfil.
  • Functional requirements document (FRD) – the FRD would be a detailed definition of everything expressed in the BRD and SRS.

Can we unlock Source Campaign in opportunity form


Hello Everyone,

Right now the only way to attach an opportunity to a marketing campaign is via the response feature in the marketing campaign. However, the problem with that is you have to do two extra steps to create and attach a marketing campaign to an opportunity. Is there a way to unlock the Source Campaign field and attach straight from the new opportunity form? Instead of going to the marketing campaign. Thank you very much

MB 600: Microsoft Dynamics 365 + Power Platform Solution Architect – Design data and security model (Part One)


I am currently preparing to take the Microsoft Dynamics 365 + Power Platform Solution Architect certification. Or “MB 600” for short! As I prepare I plan to write revision notes; in this post I will cover topics around designing the data and security models.

Data is a big topic! I therefore plan to split this information into two posts. In this first post I will cover an overview of security and data design.

Note: Microsoft have recently updated some of their terminology. Therefore we should consider terms like CDS and Dataverse as interchangeable. In terms of the exam I would expect them to begin using the term Dataverse at some point but that change is unlikely to be immediate. Meaning in the short term either term could be used. I created this blog post before the revised terms were announced. I have tried to apply updates but a few “outdated” references may still exist!


Security Overview

As a Solution Architect we need to consider if there are any security, regulatory or compliance requirements that will impact the solution design. Unfortunately, I have often seen security handled as an afterthought but the Solution Architect should consider security throughout the entire application lifecycle. From design and implementation to deployment and even ongoing into operations.

A discovery phase should review / document existing security measures. This is because a single project is unlikely to change the entire approach to authentication. You should understand if single sign-on is already in place. Or if the customer is using 3rd party authentication products or “just” Azure Active Directory. And if multi-factor authentication is in place.

It is also important to consider how the organization’s structure may influence security models. To give just one example, I once worked with a large insurance company. For them it was critical that the data held by insurance brokers was kept isolated from records held by the insurance underwriting teams. These types of organizational structural requirements could lead the Architect to conclude multiple business units, environments or even tenants are required.

The Architect may need to review / design multiple layers of security. Including;

  1. Azure AD conditional access – blocking / granting system access based on user groups, devices or location.
  2. Environment roles – include user and admin roles (such as global admin). You can read about 365 admin roles here.
  3. Resource permissions for apps, flows, custom connectors etc.
  4. Dataverse (aka CDS) security roles – control access to entities and features within the Dataverse environment.
  5. Restrictions to the 300+ Dataverse connectors – data loss prevention policies (DLP) used to enforce how connectors are used.

You can read about restricting access here.

Security Groups

Within the Power Platform admin center we can optionally associate a Dataverse environment with a security group.

Whenever a license is assigned to a user, by default a user record would be created in all enabled Power Platform environments within the tenant. This could effectively grant them access to all environments within the tenant.

Security groups can be used to limit access to environments. Then only users who are added to the associated security group will be added to the environment as a user. If a user is ever removed from the security group they are automatically disabled.

Note:
If a security group is associated with an existing environment all users in the environment that are not members of the group will therefore be disabled.

Tip:
Whilst security is massively important, when presented with a requirement to restrict access to data it is worth questioning if this is really a security requirement. Or is it just filtering of data for convenience? By doing this you should create a solution which is secure but does not include any unnecessary boundaries.

Data Overview

Additionally you may need to consider any specific requirements around data storage. How long must data be retained? Does it need to reside in a specific country? Are there any laws that apply to how the data can be stored / used?

Sometimes regulatory requirements may also impose specific service level agreements or turnaround times. Or dictate that specific parties / governing bodies must be kept informed or involved in certain transactions.

Data should always be considered as a valuable asset. Therefore designing a security model to ensure the proper usage and access to that valuable asset is paramount. Features like Azure Conditional Access and Data Loss Prevention Policies can be enabled. Additionally ensuring proper usage of secrets / certificates for the services that access the data may be essential.

Below you can see a diagram which highlights the layers of security which are available. When working with Dynamics 365 an Architect may tend to focus on the security roles found in the Dataverse. But it should also be noted that beyond these roles additional layers of security exist. For example, to give conditional access to Azure or restrict access to connectors.

There are numerous standards for data compliance covering industry certifications, data protection and physical security. During the requirement gathering phases the Solution Architect should question which standards are applicable and may need to confirm compliance. Some examples include ISO27001, GDPR etc. Microsoft publish details of various standards and their compliance here.

Design entities and fields

The Solution Architect should lead the data model design. With model-driven apps …. It is not uncommon for my design work to actually start with what data is to be stored and build out from there. Therefore establishing a high-level data architecture for the project can be an essential early task.

It may be common for the Solution Architect to design the data model at a high-level before other individuals extend their design. For example, the architect might define the core entities and their relationships but maybe the fields within those entities will be defined later by the design team. If this is the case the Architect would still need to review these detailed designs and provide feedback as the detailed data model evolves.

The Dataverse includes the Common Data Model! This is a set of system tables which support many common business scenarios. The Common Data Model (CDM) is open-sourced in GitHub and contains over 260 entities. Many systems and platforms implement the CDM today. These include Dataverse, Azure Data Lake, Power BI dataflows, Azure Data services, Informatica and more. You can find the CDM schema in GitHub here.

In addition to the industry standard CDM schema, Microsoft provide industry specific accelerators aimed at particular vertical markets. Examples include Healthcare, Non-profit, Education, Finance and Retail. ISVs may then create industry specific apps which leverage these accelerators. You can find out about the accelerators here.

Whenever possible the standard system tables within the Common Data Model should be used. For example, if you need to record details about customers use the account table for that. This will not only make the system quicker and easier to develop but will aid future maintainability. All Architects should avoid re-creating the wheel!

It will be common to create diagrams to illustrate the data model, often called Entity Relationship Diagrams (ERDs). These show the entities (aka tables) and how they relate to other tables.

In my ERDs I like to highlight which entities are out of the box with no change, which are leveraging out of the box tables but with additional custom fields and which are completely custom.

Typically we will be thinking about tables and columns held within the Dataverse (CDS); conceptually it might be easy to think of Dataverse as a traditional database. But the Dataverse is much more than that! It includes a configurable security model, can support custom business logic that will execute regardless of the application and even stores data differently depending on its type. Relational data, audit logs and files (such as email attachments or photos) are all stored differently.

Sometimes, rather than depicting the tables in an ERD the Solution Architect may first create diagrams to illustrate the flow of data within the solution. Without needing to “worry” about the physical implementation. These diagrams are known as logical data models. Only once the logical data model is understood would the Architect then create a physical data model based on the logical model. The physical data model could take the form of an ERD and could include data within Dataverse, Azure Data Lake, connectors or other data stores.

There are several strategies / techniques that can be used to help the Architect when creating a data model;

  • Always start by depicting the core tables and relationships– having a focus on the core tables will avoid getting side-tracked into smaller (less important) parts of the solution.
  • Avoid over normalization– people with a data architecting background may tend to try and build a Dataverse data model with traditional SQL database concepts in mind. A fully normalised database within the Dataverse may have an adverse impact on the user experience!
  • Start with the end in mind– it is often useful to start off by defining the final reporting needs, you can then confirm the data model adequately meets those requirements.
  • Consider what data is required for AI– if you intend on implementing an AI element into your solution design consider what source data will be needed to support any machine learning / AI algorithms.
  • Plan for the future– consider the data model requirements for today but plan for how this might evolve into the future. (But avoid trying to nail every future requirement!)
  • Use a POC– creating a proof of concept to demonstrate how the data might be represented to users can be a valuable exercise. But be prepared that this might mean trying a data model and then throwing it away and starting again.
  • Don’t build what you don’t need – avoid building out parts of the data model you don’t plan to use. It is simple to add columns and tables later, so add them when you know they are required.

Once the tables within your solution design have been considered you will need to consider the detailed design task of creating columns (aka fields). Columns can be of many different data types, some of which have specific considerations. I will mention just a few here;

  • Two options (yes/no)– when you create these ensure you will never need more choices! If unsure maybe opt for a “Choices” column.
  • File and image– allows the storing of files and images into the Dataverse.
  • Customer– a special lookup type that can be either a contact or account.
  • Lookup / Choices (Optionsets)– which is best for your design! Optionsets (now known as Choices) make a simple user experience but lookups to reference data can give more flexibility to add options later.
  • Date / Time – be careful to select the appropriate behaviour. (local, time zone independent, date only)
  • Number fields– we have many variations to select from. Choose wisely!

Other options exist for storing files / images. Having images and documents in the Dataverse might be useful as security permissions would apply and the user experience to complete an upload can be “pleasing”. But size limits do apply so storing large files might not be possible. Other options exist, like SharePoint, which is ideal for collaboration. Or you could consider storing the files in Azure storage which might be useful for external access or archiving purposes. As part of your revision …. you may need to be aware of the pros / cons of various methods to store files!

Design reference and configuration data

When we create a table in the Power Platform there are some key decisions to make about how the table is created. Some of these cannot be easily changed later! For example, should the rows be user / team owned or organization owned.

User / team owned records have an owner field on every row in the table. This in turn can be used to decide what level of security is applied for the owner and other users. One good out of the box example of a user owned table might be the “contact” table. Each contact in the system is owned by a single user or team. It might then be that only the owner can edit the contact but maybe everyone can see the contact.

Alternatively tables can be organisation owned. With these you either get access or not! The records within the table are often available to everyone in the organization. These tables are ideal for holding reference / configuration data.

Often one consideration when designing reference data is to consider if a Choice (optionset) or a lookup to an organization owned table is the best approach. I find “choices” most useful for short lists which rarely change. Whilst lookups are ideal for longer lists that might evolve over time. (As users can be granted permissions to maintain the options available in a lookup. But as the Choice column forms part of your solution the development team would need to alter the items in a Choice column.)

MB 600: Microsoft Dynamics 365 + Power Platform Solution Architect – Design data and security model (Part Two)


I am currently preparing to take the Microsoft Dynamics 365 + Power Platform Solution Architect certification. Or “MB 600” for short! As I prepare I plan to write revision notes; in this post I will cover topics around designing the data and security models.

Data is a big topic! I therefore plan to split this information into two posts. In this second post I will dive deeper into complex scenarios and discuss the more technical aspects of Dataverse (CDS) security.

Note: Microsoft have recently updated some of their terminology. Therefore we should consider terms like CDS and Dataverse as interchangeable. In terms of the exam I would expect them to begin using the term Dataverse at some point but that change is unlikely to be immediate. Meaning in the short term either term could be used. I created this blog post before the revised terms were announced. I have tried to apply updates but a few “outdated” references may still exist!


Design complex scenarios

Often a Dynamics 365 application will simply access data that resides in the Dataverse. However Power Automate, Power BI and Power Apps (Canvas Apps) can leverage data connectors to access data from many hundreds of sources. Additionally custom connectors can be created as required. These connectors allow us to leverage existing data sources and services.

When an “off the shelf” connector does not exist a custom connector can be created. This might work by making use of an existing API. Or you may need to create a custom API and define your own actions. The connectors can make use of OAuth (including Azure AD), API key and basic auth. Connectors can be packaged and deployed via solutions. Creating a connector that is then available to users for re-use is a great way to empower them to extend the solution using the Power Platform.

Data modelling on the Power Platform should therefore look at the whole data architecture picture and include a logical look at data from the Dataverse, Data Lakes and external sources using connectors.

Azure Data Lake is a hyper-scale repository for big data analytics. Azure Data Lake can store data from all disparate sources, including cloud and on-premise sources of any size and structure. The Data Lake can include CDM and can be integrated with Dataverse. The huge scale of the Data Lake may help support complex scenarios like machine learning, AI and complex reporting when extremely large data volumes maybe required.

  • Dataverse– used for transaction data that apps will consume and maintain. Structure typically based on the common data model.
  • Azure Data Lake– ideal for data from other / multiple systems, read focused and can leverage the common data model.
  • Connectors– great for leaving the data where it is. Supports accessing external sources to make their data available in apps.

There are many drivers which may influence data model decisions, including;

  • Security requirements
  • User experience– it’s easy to forget that as we add normalization and relationships we create new constructs users need to navigate to be successful
  • Data location– some data must be stored in a given GEO
  • Retention policies– not all data can be stored indefinitely!
  • Self-service reporting– if the data model becomes complex can a “regular” user still extract the data for their reporting purposes? Or will all reports end up needing a data architect to design them??
  • Roadmap – what are the plans for the future?
  • Existing systems– are we going to connect with any existing systems or does any existing data need to be “squeezed” into the new data model.
  • Localization– Multi-region, multi-lingual, multi-currency requirements

Custom Activities

Out of the box we have activities like phone call, email and task. But custom activities can be created as required. The advantage of creating a custom activity is that they show in the timeline lists with other activities. But you need to be aware that security is applied consistently across all activities. So if a user is given the ability to delete tasks, this would actually mean they can delete any activity type. (Including your custom activity.) Activities can be regarding any table that is enabled for activities, meaning a custom activity could be regarding “anything”. (Although you could consider some control by limiting which entities are available to users in your model-driven app!)

Calculated and Rollup Fields

The Dataverse provides us with options to create calculated or rollup fields.

Calculated fields are populated on the retrieve of records. They are read-only. They can be based on fields within the table or its parent.

Rollup fields are stored on the table. Like calculated fields they are read-only, they are calculated (re-calculated) based on an update schedule or on demand. The values on the parent are rolled up from child records. (1:N only!) Filtering on the related table can be applied.

In complex scenarios …. it is possible to include rollup fields in calculated fields. And it is possible to rollup “simple” calculated fields.

Relationships

Often we will work with one to many (1:N) relationships. So one order can have many order lines etc.

There are situations when you may need a many to many relationship (N:N). For example one patient could be seen by many doctors. And at the same time each doctor would have many patients.

Deciding when an N:N relationship is needed can sometimes be harder than you’d think! Creating a POC of your data model can help identify many to many relationship requirements.

Tip: If you aren’t fully familiar with the types of relationships in the Dataverse and the possible cascade logic then some additional revision into relationships may be beneficial.

You may wish to review this post in which I explain table relationships.

Alternate Keys

You can also define alternate keys. Alternate keys can contain decimal, whole number, text fields, dates and lookup fields. Entities can have up to 5 alternate keys. An index is created behind the scenes which is used to enforce uniqueness. An alternate key can be made up of multiple fields but its total length cannot exceed 900 bytes or 16 columns per key.

Alternate keys are often useful for fields like account number (etc) that may be primary references supplied to / from external systems. And would therefore need to have uniqueness strictly enforced.
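As a small, hedged illustration of why this matters for integrations, the Dataverse Web API allows a record to be addressed by its alternate key rather than its GUID. (This assumes an alternate key has been defined on accountnumber; the account number, organisation URL and token below are made up.)

```typescript
const orgUrl = "https://yourorg.crm.dynamics.com"; // illustrative organisation URL
const token = "<access token acquired via Azure AD>"; // token acquisition not shown

const headers = {
  Authorization: `Bearer ${token}`,
  Accept: "application/json",
  "Content-Type": "application/json",
  "OData-MaxVersion": "4.0",
  "OData-Version": "4.0",
};

// Retrieve an account using the alternate key instead of its GUID.
const account = await (
  await fetch(`${orgUrl}/api/data/v9.2/accounts(accountnumber='ACC-0001')?$select=name`, { headers })
).json();

// The same addressing style supports an "upsert" - PATCH will create the record
// if the key does not yet exist, or update it if it does.
await fetch(`${orgUrl}/api/data/v9.2/accounts(accountnumber='ACC-0001')`, {
  method: "PATCH",
  headers,
  body: JSON.stringify({ name: "Contoso (updated from the external system)" }),
});
```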

Design business unit team structure

Business Units are a fundamental building block in the Dynamics security model. They define the structure of the organization.
Business units provide a great way to manage record access for large numbers of users with permissions being granted to the current business unit and / or its children.

Therefore the first security construct to consider are business units, these allow you to define the structure of your organization. Users, teams and security roles will be linked to business units making them a fundamental building block in the security model. Business units can have a hierarchy. This might reflect the organizations actual hierarchy but that isn’t always the case. In actual fact the hierarchy is designed to “drive” which users get access to what records.

Each business unit has a default team that includes all users assigned to that business unit. Each user can only be in one business unit but they can be added to additional teams as required.

Teams have several uses in Dynamics 365 / Power Platform. Teams are essentially lists or groups of users and can be used to help manage security. Teams may also be useful when creating reports. Additionally, teams can be useful when you wish to share “assets” with a group of users. (Assets could include views, charts, dashboards or records.)

Owner teams within the Dataverse can simply include lists of users. But a Dataverse team can also be connected to an Azure Active Directory Office Group or Security Group.

We also have a concept of Access Teams. Access teams are dynamically created on a per record basis. The access granted to the members of the team to the specific record is based on an access team template.

If you need to revise the concepts of business units and teams in greater depth, try reading my post on the subject here.

Tip:
When designing the data model consider its manageability. Creating large numbers of business units and security roles may give granular control of security options but could also result in a solution which is a nightmare to manage. You should use business units to restrict access as required but do not be tempted to simply match the organisation’s structure.

Security Hierarchy

An additional approach to security is to use a security hierarchy. With this approach a manager is granted access to the records of their team members. As a multi-level team structure could exist the permissions can be inherited through multiple levels. This way a manager can read records for their direct reports and further into the hierarchy. The manager however can only maintain records owned by their direct reports.

The security hierarchy can be based on a concept of manager or position. With manager the manager field from the systemuser record is used. With a position approach custom positions are defined and users assigned to each position. (The position approach allows crossing of business unit boundaries.)

When defining the hierarchy settings we can decide how many levels to target. The number of levels implemented should typically be kept as low as possible, maybe 4 levels. Although the theoretical maximum number of levels is 100.

Note: You cannot combine manager and position hierarchies at the same time.

Design security roles

Security roles within the Dataverse provide users with access to data. As a general principle users should only be granted access to data that they really need to access.

The Solution Architect will need to decide what approach should be taken to security roles. Dynamics 365 does ship with a number of out of the box security roles, commonly these will be copied and then amended as required. There are some common strategies for building security roles;

  • Position specific– each person who holds a given position is given one specific role. (All users have one role.)
  • Baseline + position– a baseline role is created which grants access to the entities needed by all users. Then an additional role is added specific to the additional requirements for a given position. (All users have at least two roles.)
  • Baseline + capability– again a baseline role is created. Then roles for each capability / feature that might be required. (All users would have multiple roles.)

Tip:
As part of your revision you may wish to consider the advantages and disadvantages of each of the strategies above. Additionally maybe review how security is achieved in your current organization!

Each role is linked to a business unit. The roles then contain many options that govern the access to entities and features for the combination of business unit and user (or team).

Often roles will be inherited from a parent business unit. Meaning when I use the manage roles on a user linked to a child business unit, I still see the roles from the parent business unit. This happens even though I haven’t created any roles specific for the child unit!

Each user (or team) can have multiple roles. The Dynamics 365 security model works on a least restrictive basis. Meaning a “sum” of all the roles assigned to the user would be applied. Additionally when a role is assigned to a team, we can select if the privileges apply to “just” the team or if they get applied directly to the user. Meaning members of the team inherit the privileges as if the role is applied directly to their user record.
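To see the “sum of roles” idea in practice, the hedged sketch below lists the security roles directly assigned to a user via the Dataverse Web API. (The systemuserroles_association navigation property is my assumption of the standard user-to-role relationship name, so treat it as illustrative and verify it against your environment’s metadata; roles inherited through team membership would need a similar query against the team.)

```typescript
// Minimal sketch: list the security roles directly assigned to a user.
async function getDirectRoles(orgUrl: string, token: string, userId: string): Promise<string[]> {
  const url =
    `${orgUrl}/api/data/v9.2/systemusers(${userId})` +
    // Navigation property name is an assumption - check your environment's metadata.
    `?$select=fullname&$expand=systemuserroles_association($select=name)`;

  const response = await fetch(url, {
    headers: {
      Authorization: `Bearer ${token}`,
      Accept: "application/json",
      "OData-MaxVersion": "4.0",
      "OData-Version": "4.0",
    },
  });

  const user = await response.json();
  return (user.systemuserroles_association ?? []).map((role: { name: string }) => role.name);
}
```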

Each role is made up of several options (actions) for each table / feature in Dynamics 365. Entities typically have multiple actions covering functions including, create, read, write, delete, append, append to, assign and share.

Each table can be organization owned or user / team owned. Organization owned entities have just two security privilege levels, meaning each user either is or isn’t granted access. Whilst user / team owned entities support access being granted at the user, business unit, child business unit or organization level.

If you need to revise the concepts associated with security roles in greater depth, try reading my post on the subject here.

Design column (aka field) security

Often it will be sufficient to only control access to a table. But it may be that a particular column on a table contains sensitive data that only certain users can see and maintain. We can hide columns on forms but this does not actually secure the data.

In these scenarios Dynamics 365 and the Power Platform supports column level security.

For example you may store information on a contact about their disabilities / special needs. A limited number of users may need access to this information but maybe most users don’t need to see what disabilities a person might have.

You can read about how this operates here.

MB 600: Microsoft Dynamics 365 + Power Platform Solution Architect – Design integrations


I am currently preparing to take the Microsoft Dynamics 365 + Power Platform Solution Architect certification. Or “MB 600” for short! As I prepare I plan to write revision notes; in this post I will cover my revision connected with designing integrations.

Note: Microsoft have recently updated some of their terminology. Therefore we should consider the terms like CDS and Dataverse as interchangeable. In terms of the exam I would expect them to begin using the term Dataverse at some point but that change is unlikely to be immediate. Meaning in the short term either term could be used. I created this blog post before the revised terms were announced. I have tried to apply updates but a few “outdated” references may still exist!

The Solution Architect will be required to identify when integrations may be required. They will lead the design of any integrations and document how they will be included into the overall architecture. It will be the Architect’s role to ensure any integrations do not make the solution “fragile”, and they may additionally need to consider integrations as part of an overall disaster recovery plan.

There are many reasons we need integrations. Integrations may be required to provide a consistent interface for users / customers. Maybe real-time integrations are needed to keep disparate systems up-to-date. Often integrations may be required to avoid reinventing the wheel! Reusing an existing system or component with the help of integrations may be more cost effective and achievable than reimplementing.

A Power Platform Solution Architect must have a deep technical knowledge of Power Platform and Dynamics 365 apps plus at least a basic knowledge of related Microsoft cloud solutions and other third-party technologies. When designing integrations the Solution Architect will review the requirements and identify which parts can leverage Dynamics 365 apps and which parts must be custom built by using the Power Platform, Microsoft Azure etc.

Historically an architect might have started with custom development in mind but a Power Platform Solution Architect will typically start their design process with a focus on Dynamics 365 and the Power Platform and only later use Microsoft Azure and other custom approaches to address any gaps.

When considering the design of integrations it may be worth thinking about how the Dataverse operates. The Dataverse is a software-as-a-service (SaaS) platform, therefore most of its architecture, such as the underlying data storage, is abstracted from developers so they can focus on other tasks such as building custom business logic and integrating with other applications.

Note:
Historically we may have used the SOAP API to programmatically retrieve data from the Dataverse (CDS). But these days the Web API is the preferred approach. If you aren’t already familiar with the Web API approach you can read about it here.
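To make that concrete, below is a minimal sketch of a Web API (OData) query written in TypeScript. The environment URL, entity and columns are placeholder assumptions, and I’ve assumed a valid access token has already been obtained.

// Minimal sketch: query the Dataverse Web API (OData) for a few accounts.
// "yourorg.crm.dynamics.com" and accessToken are placeholders you would replace
// with your own environment URL and a valid OAuth bearer token.
async function getTopAccounts(accessToken: string): Promise<void> {
  const url =
    "https://yourorg.crm.dynamics.com/api/data/v9.2/accounts" +
    "?$select=name,telephone1&$top=5";

  const response = await fetch(url, {
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "OData-MaxVersion": "4.0",
      "OData-Version": "4.0",
      Accept: "application/json",
    },
  });

  if (!response.ok) {
    throw new Error(`Web API call failed: ${response.status}`);
  }

  const data = await response.json();
  // Each record in "value" contains only the columns requested in $select.
  for (const account of data.value) {
    console.log(account.name, account.telephone1);
  }
}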

Most solutions do not exist in isolation, they rely on internal and external integrations. As part of identifying solution components the Architect should highlight how any integrations will be handled. We may need to define what tools or services will be used to complete the integrations. And also potentially define clear boundaries on where one system ends and another begins. The Solution Architect should focus on what happens on these boundaries. Often multiple parties will be involved in the development of integrations so clearly defining which boundaries are “owned” by which supplier or internal development teams may be critical.

Integrations can take many forms.

Data integration – “just” combining data from different sources, maybe to provide the user with a unified view. Data integrations may be event based (near real-time) or batch based (maybe with overnight updates).

Application integration – a higher level integration connecting at the application layer.

Process integration – potentially you retain multiple disparate systems but each of those systems remains part of the overall business function.

Design collaboration integration

Often integration may involve collaboration tools such as Outlook, Teams, Yammer and more.

SharePoint for example may be leveraged for document storage, whilst Teams can be created to help groups of users collaborate.

Often the associated collaboration tools will sit outside of the Power Platform and Dynamics 365. Meaning each tool would have its own requirements / constraints in terms of license costs, security considerations and more.

Design Dynamics 365 integration

There are a number of different types of integration / extensibility options available to us when considering the integration of Dynamics 365 applications.

Some of these capabilities are baked into the Power Platform, including business rules, calculated fields, workflow processes and much more. Client-side extensibility can also be created with JavaScript.

Additionally developers can access the transactional pipeline with plug-ins (using the .NET SDK) or custom workflow activities. As mentioned we can access the Dataverse (CDS) API service using SOAP or the Web API (OData).

Custom business logic can be created via plug-ins. This custom logic can be executed synchronously or asynchronously. The results of synchronous calls can be seen by the users immediately but this does mean the user is “blocked” until the call completes. Whilst asynchronous calls will not block the user. However asynchronous calls can only run post-operation. (Synchronous calls can run post or pre-operation.)

Synchronous calls are included in the original transaction but are limited to 2 minutes. Whilst asynchronous customizations would not be part of the original transaction.

FYI: When you use the Dataverse connector in Power Automate or a Canvas App it will be making calls to the OData API. (Power Automate runs asynchronously.) Classic workflows can run synchronously (real-time) or asynchronously.

Any custom logic may run on the client or the server. Examples of client side extensibility would include canvas app formulas, JavaScript on model-driven forms, business rules (with form scope) and the Power Apps component framework. Client side customizations happen in the user interface, meaning the user will see the results immediately. But as they run in the user interface any customizations will normally only be enforced in the specific client applications.
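As a simple illustration of the client side option, here is a hedged sketch (in TypeScript) of an OnLoad handler that could be registered on a model-driven form. The function name, the column name and the notification text are all my own example values.

// Sketch of client side logic: an OnLoad handler for a model-driven form.
// "creditlimit" and the notification text are example values only.
function onAccountFormLoad(executionContext: any): void {
  const formContext = executionContext.getFormContext();

  // Read a column value from the form as soon as it loads.
  const creditLimit = formContext.getAttribute("creditlimit")?.getValue();

  // The user sees the result immediately, but only in clients where this
  // script has been registered - server side logic would still be needed to
  // enforce the rule everywhere.
  if (creditLimit !== null && creditLimit > 100000) {
    formContext.ui.setFormNotification(
      "High value account - please check approval before changing terms.",
      "INFO",
      "highValueNotification"
    );
  }
}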

Whilst server side customizations only happen when the data is sent to the server. Meaning users would only see the results when a data refresh is completed. Server side customizations can be created with plug-ins, Power Automate flows, classic workflows and business rules (with entity scope).

To ensure consistent availability and performance for everyone, the platform applies limits to how the Common Data Service APIs are used. Service protection API limits help ensure that users running applications cannot interfere with each other based on resource constraints. These limits should not affect normal users of the platform. You can read an overview of API limits here.

API requests within the Power Platform consist of various actions which a user makes. Including;

  • Connectors – API requests from connectors in Power Apps and Power Automate
  • Power Automate – all Power Automate step actions result in API requests
  • Dataverse (Common Data Service) – all create, read, update and delete (CRUD) operations. Plus “special” operations like share and assign.

You can read more about APIs and their limits here. Because limits exist the Solution Architect may need to optimize integrations to minimize API calls. Additionally data integration applications may need to handle API limit errors, this is done by implementing a strategy to retry operations if an API limit error is received. You can read about the service protection and API limits here.
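A typical retry strategy honours the Retry-After header returned with an HTTP 429 response and backs off before trying again. Below is a minimal TypeScript sketch of that idea; the retry count and fallback delay are arbitrary assumptions rather than recommended values.

// Sketch: retry a Dataverse Web API call when a service protection limit (429) is hit.
// maxRetries and the exponential fallback are example choices only.
async function callWithRetry(url: string, init: RequestInit, maxRetries = 3): Promise<Response> {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const response = await fetch(url, init);

    if (response.status !== 429) {
      return response; // Success, or an error we don't retry.
    }

    // Prefer the Retry-After header (seconds) when the platform supplies one,
    // otherwise fall back to a simple exponential back-off.
    const waitSeconds = Number(response.headers.get("Retry-After")) || 2 ** attempt;
    await new Promise((resolve) => setTimeout(resolve, waitSeconds * 1000));
  }
  throw new Error("Request still throttled after all retry attempts.");
}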

Design internal system integration

Internal app builders / customizers may be able to leverage low code approaches to support internal integrations. Maybe these integrations can be created using canvas apps or Power Automate flows with standard or custom connectors. Additionally canvas apps may be embedded into model-driven apps to leverage connectors.

Power Automate may be used for event based data integrations especially if integrations are needed between multiple Dataverse environments.

Design third-party integration

Developers may be required to assist with 3rd party integrations. Maybe this will involve creating custom APIs, developing plug-ins or using other technologies such as Azure Logic Apps or Azure Service Bus.

Virtual entities may also give visibility of external data sources directly in model driven apps. Or maybe custom UI controls can be created to surface 3rd party data using the PowerApps Component Framework (PCF).

With inbound data integrations 3rd party tools such as KingswaySoft or Scribe may be used. When using such tools performance / throughput may need to be considered, you possibly need to design multiple threads to overcome latency effects.

Web application integrations may be created using webhooks, where custom callbacks may be triggered using JSON in an HTTP POST request. Webhooks may be synchronous or asynchronous.
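To give a feel for the receiving end, here is a very small sketch of a webhook endpoint using Node’s built-in http module. It simply accepts the JSON body of the HTTP POST; the port, the MessageName property and the logging are illustrative assumptions, and a real receiver would also validate the caller.

import * as http from "http";

// Sketch: a minimal webhook endpoint that accepts a JSON payload via HTTP POST.
const server = http.createServer((req, res) => {
  if (req.method !== "POST") {
    res.writeHead(405).end();
    return;
  }

  let body = "";
  req.on("data", (chunk) => (body += chunk));
  req.on("end", () => {
    const payload = JSON.parse(body); // The execution context posted by the caller.
    console.log("Webhook received:", payload.MessageName ?? "(unknown message)");
    res.writeHead(200).end();
  });
});

server.listen(8080); // Example port only.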

Design authentication strategy

The systems into which integrations are required may demand differing approaches to authentication.

When working in a batch mode for updating data the timing of authentication requests may also be a consideration which could impact performance. Maybe you shouldn’t authenticate on every request!

OAuth

OAuth is a standard that apps can use to provide client applications with “secure delegated access”. OAuth works over HTTPS and authorizes devices, APIs, servers, and applications with access tokens rather than credentials. Authentication integration may be provided with OAuth providers such as Azure Active Directory, Facebook, Google, Twitter and Microsoft accounts. With basic authentication the user would always have to provide a username and password. With OAuth the user sends an API key ID and secret.
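As a hedged illustration of that flow, the sketch below requests a token from Azure Active Directory using the client credentials grant and returns it for use as a bearer token. The tenant, client id, secret and scope are all placeholders you would replace with your own values.

// Sketch: obtain an OAuth access token from Azure AD (client credentials flow).
// All identifiers below are placeholders - supply your own tenant, app registration
// details and the scope of the resource you are calling (e.g. your Dataverse org).
async function getAccessToken(): Promise<string> {
  const tenantId = "<your-tenant-id>";
  const body = new URLSearchParams({
    grant_type: "client_credentials",
    client_id: "<your-app-client-id>",
    client_secret: "<your-app-client-secret>",
    scope: "https://yourorg.crm.dynamics.com/.default",
  });

  const response = await fetch(
    `https://login.microsoftonline.com/${tenantId}/oauth2/v2.0/token`,
    { method: "POST", body }
  );

  const json = await response.json();
  return json.access_token; // Presented as a bearer token on subsequent requests.
}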

Design business continuity strategy

In terms of availability and disaster recovery the Power Platform components should handle concerns with internal integrations. Therefore the Solution Architect may need to focus on external integrations. Additionally manual environment backup / restore operations can be used to help with “self-created” problems like bad deployments or mass data corruptions.

Separation of applications can reduce their reliance on each other. Using Azure Service Bus may allow integrations between systems in a decoupled manner.

Design integrations with Microsoft Azure

Webhooks can only scale when the host application can handle the volume. Azure Service Bus and Azure Event Hubs may be used for high scale processing / queuing of requests. Although the Azure approach can only be asynchronous. (Webhooks can be synchronous or asynchronous.)

Azure Service Bus

CDS supports integration with Azure Service Bus. Developers can register plug-ins with Common Data Service that can pass runtime message data, known as the execution context, to one or more Azure solutions in the cloud. Azure Service Bus integrations provide a secure and reliable communication channel between the Common Data Service runtime data and external cloud-based line-of-business applications. You can read more about Azure integration here.

Azure Service Bus distributes messages to multiple independent backend systems, decoupling the applications.

Azure Service Bus can protect the application from temporary peaks.
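For illustration, pushing a message onto a Service Bus queue with the @azure/service-bus SDK looks roughly like the sketch below. The connection string, queue name and message body are assumptions for the example.

import { ServiceBusClient } from "@azure/service-bus";

// Sketch: send a message to an Azure Service Bus queue so a decoupled backend
// system can pick it up later. Connection string and queue name are placeholders.
async function queueIntegrationMessage(): Promise<void> {
  const client = new ServiceBusClient("<service-bus-connection-string>");
  const sender = client.createSender("dataverse-events");

  try {
    await sender.sendMessages({
      body: { entity: "account", operation: "update", recordId: "<record-guid>" },
      contentType: "application/json",
    });
  } finally {
    await sender.close();
    await client.close();
  }
}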

Azure Event Hubs

Azure Event Hubs is a big data streaming platform and event ingestion service. It can receive and process millions of events per second. Data sent to an event hub can be transformed and stored by using any real-time analytics provider or batching/storage adapters.

The following are some of the scenarios where you can use Event Hubs:

  • Anomaly detection (fraud/outliers)
  • Application logging
  • Analytics pipelines, such as clickstreams
  • Live dashboarding
  • Archiving data
  • Transaction processing
  • User telemetry processing
  • Device telemetry streaming

Event Hubs provides a distributed stream processing platform with low latency and seamless integration, with data and analytics services inside and outside Azure to build your complete big data pipeline.

Event Hubs represents the “front door” for an event pipeline, often called an event ingestor in solution architectures. An event ingestor is a component or service that sits between event publishers and event consumers to decouple the production of an event stream from the consumption of those events. Event Hubs provides a unified streaming platform with time retention buffer, decoupling event producers from event consumers.
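The producer side of such a pipeline, using the @azure/event-hubs SDK, might look something like the sketch below; the connection string, hub name and event payloads are example assumptions.

import { EventHubProducerClient } from "@azure/event-hubs";

// Sketch: publish telemetry style events to an Event Hub for downstream processing.
// The connection string, hub name and payloads are placeholder values.
async function publishTelemetry(): Promise<void> {
  const producer = new EventHubProducerClient("<event-hubs-connection-string>", "telemetry");

  const batch = await producer.createBatch();
  batch.tryAdd({ body: { device: "sensor-01", reading: 21.4 } });
  batch.tryAdd({ body: { device: "sensor-02", reading: 19.8 } });

  await producer.sendBatch(batch);
  await producer.close();
}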

Azure Logic Apps

Azure Logic Apps is a cloud service that helps you schedule, automate, and orchestrate tasks, business processes, and workflows when you need to integrate apps, data, systems, and services across enterprises or organizations. Logic Apps simplifies how you design and build scalable solutions for app integration, data integration, system integration, enterprise application integration (EAI), and business-to-business (B2B) communication, whether in the cloud, on premises, or both.

Every logic app workflow starts with a trigger, which fires when a specific event happens, or when new available data meets specific criteria. Many triggers provided by the connectors in Logic Apps include basic scheduling capabilities so that you can set up how regularly your workloads run.

In many ways Azure Logic Apps and Power Automate flows have a lot in common. However, Power Automate flows can be packaged as part of a CDS solution. And the Power Automate CDS connector has more capabilities. Plus Power Automate allows UI automation.

Azure Functions

Azure Functions allows you to run small pieces of code (called “functions”) without worrying about application infrastructure. With Azure Functions, the cloud infrastructure provides all the up-to-date servers you need to keep your application running at scale.

A function is “triggered” by a specific type of event. Supported triggers include responding to changes in data, responding to messages, running on a schedule, or as the result of an HTTP request.

While you can always code directly against a myriad of services, integrating with other services is streamlined by using bindings. Bindings give you declarative access to a wide variety of Azure and third-party services.

Azure functions are serverless applications.
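A minimal HTTP triggered function (using the Node/TypeScript programming model) could look like the sketch below; the query parameter and response text are illustrative only.

import { AzureFunction, Context, HttpRequest } from "@azure/functions";

// Sketch: a tiny HTTP triggered Azure Function. The "name" parameter and the
// response text are example choices only.
const httpTrigger: AzureFunction = async function (
  context: Context,
  req: HttpRequest
): Promise<void> {
  const name = req.query.name || (req.body && req.body.name) || "world";

  context.res = {
    status: 200,
    body: `Hello, ${name}. This ran on demand with no servers to manage.`,
  };
};

export default httpTrigger;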

Choosing between Power Automate, Azure Logic Apps, Azure Functions and Azure App Service WebJobs may be confusing! As all of these solve integration problems and automate business processes. This link may help you compare these options!

MB 600: Microsoft Dynamics 365 + Power Platform Solution Architect – Validate the solution design


I am currently preparing to take the Microsoft Dynamics 365 + Power Platform Solution Architect certification. Or “MB 600” for short! As I prepare I plan to write revision notes; in this post I will cover everything that might fall under the heading “validate the solution design”.

Note: Microsoft have recently updated some of their terminology. Therefore we should consider terms like CDS and Dataverse as interchangeable. For the exam I would expect them to begin using the term Dataverse at some point but that change is unlikely to be immediate. Meaning in the short term either term could be used. I created this blog post before the revised terms were announced. I have tried to apply updates but a few “outdated” references may still exist!

The Solution Architect will work with the Quality Assurance (QA) team to ensure testing includes all parts of the architecture. This could include functional testing but may also include other types of testing such as disaster recovery and performance testing.

Proper testing is essential to ensure project success. The Architect is often one of the key people who knows the solution the best and therefore can guide the test team on how to validate it. Testing shouldn’t be considered something that just happens towards the end of a project, testing must be an ongoing effort from the first component built until go live. It is not a one-time big exercise; it is an iterative, repetitive task that happens throughout the project life cycle.

Evaluate detail designs and implementation

By the implementation (build) phase the Solution Architect has set the path the implementation team will follow. By this point the Architect will have created a high level solution design document (SDD) and will have reviewed any associated detailed designs. Meaning that during implementation the role of the Architect will shift to a supporting one for the Project Manager and Delivery Architect, as they will be responsible for ensuring the developers create the solution as defined. This includes facilitating reviews with the team to ensure implementation is meeting the architecture as well as reviews with the customer to ensure the solution is meeting their stated requirements.

During the build problems will happen! Therefore the Solution Architect is also involved in problem solving as they are often one of the few people who understand all the “moving parts” involved in the solution. Some of those “problems” may be found by the testing team and the multiple types of testing they may complete.

There are many types of testing, I will highlight some of the common types below;

Unit Tests – Typically the unit tests will be created and run by the application builder or developer. Unit tests confirm the correct operation of each component in isolation.

It will be a common practice for everyone to check their own work before handing off. Manual testing may be completed for all apps, business rules, plug-ins etc. Some tests can be automated using Power Apps Test Studio and Visual Studio.

Functional Tests – Functional testing verifies that the implementation meets the requirements.

Acceptance Tests – Acceptance testing is completed by the end users. Although I have often seen test teams support the business in this task. The purpose / outcome of an acceptance test is the formal approval that the solution is fit for purpose.

Regression Tests – Tests completed on non-changed functions. Regression tests typically aim to prove that nothing has been “broken by accident”.

Integration Tests – Verification that everything works together. Including various internal systems and any integrated 3rd party services.

The Solution Architect may be called upon to help the test team understand how to test integrated components.

External systems will need to be available for testing. Sometimes they may require careful preparation to ensure real data is not impacted. For example, email integration testing must ensure messages aren’t inadvertently sent to real customers!

Performance Tests – Confirmation that the system can cope with expected peak loads. (And maybe “slightly” beyond the stated peak loads. Can the system cope with future growth?)

You may need to define performance “hotspots” to test. These may be areas of the system known to be complex or those where the business has expressed specific performance expectations. Additionally requirements gathering might need to establish what peak volumes look like. Occasionally you may even have contractual obligations to test, it may be that the contract specifies the system must support “n” users or cope with “y” transactions etc.

You may need to include the infrastructure as part of performance testing. For example, network traffic at remote offices may need to be measured including network latency and bandwidth.

Migration Tests – Practice data migrations to ensure data quality.

Disaster Recovery Tests – Having a disaster recovery plan is great but it is useless unless you can test it works.

Go Live Tests – It is common to complete multiple “dry runs” of the go live process. This might involve data migration tests!

Note:
Performance testing may make use of Azure App Insights.

Azure Application Insights, a feature of Azure Monitor, is an extensible Application Performance Management (APM) service for developers and DevOps professionals. Use it to monitor your live applications. It will automatically detect performance anomalies, and includes powerful analytics tools to help you diagnose issues and to understand what users actually do with your app. It’s designed to help you continuously improve performance and usability. You can find out more about Azure App Insights here.
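As a small sketch, a custom integration component could wire up the Application Insights Node SDK roughly as below to record a custom event and a timing metric; the connection string and the event / metric names are placeholders.

import * as appInsights from "applicationinsights";

// Sketch: initialise Application Insights and record a custom event plus a metric.
// The connection string and the event / metric names are placeholder values.
appInsights.setup("<application-insights-connection-string>").start();
const telemetry = appInsights.defaultClient;

function recordSync(durationMs: number): void {
  telemetry.trackEvent({ name: "CaseSyncCompleted" });
  telemetry.trackMetric({ name: "CaseSyncDurationMs", value: durationMs });
}

recordSync(850); // Example duration in milliseconds.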

Validate security

Having collected your security requirements, designed a solution and built it …. You will need to test that security is being achieved in the required manner.

In terms of the Power Platform application this can require a significant testing effort. As test scripts will be required that must be repeatedly run against each user persona (or set of security roles if you like). This effort should not be underestimated.

We should also consider security beyond the Power Platform. For example, we might be using SharePoint for document management integrated in with our solution. Meaning access to SharePoint may need to be tested independently of our Power Platform solution.


MB 600: Microsoft Dynamics 365 + Power Platform Solution Architect – Support Go-Live


I am currently preparing to take the Microsoft Dynamics 365 + Power Platform Solution Architect certification. Or “MB 600” for short! As I prepare I plan to write revision notes; in this post I will cover Support Go-Live.

Note: Microsoft have recently updated some of their terminology. Therefore we should consider terms like CDS and Dataverse as interchangeable. For the exam I would expect them to begin using the term Dataverse at some point but that change is unlikely to be immediate. Meaning in the short term either term could be used. I created this blog post before the revised terms were announced. I have tried to apply updates but a few “outdated” references may still exist!


In an ideal world there wouldn’t be a support need for a perfectly architected solution! The reality is support is always needed after an implementation. The Solution Architect should be an active participant in the planning of post go-live support and the transition from “project mode” into business as usual.

Post go-live we would typically expect the role of the Solution Architect to be diminished. As on a day-to-day basis the Architect is unlikely to have an involvement. Although often I’ve seen a heightened period of support immediately after the initial production release, I think we can reasonably expect the Solution Architect to remain closely involved during this early period. Often this will involve helping progress any bugs or minor enhancements required post go-live. The Architect may also need to review various reports on system usage, network telemetry, storage capacity and much more. This will help them understand that the solution is being used as expected and is also performing in an optimal way.

This continued involvement of the Solution Architect post go-live helps ensure the team stays engaged as they need to stay involved until the “very end” to ensure success.

Microsoft Release Process

In addition to considering the customer’s release / deployment approach the Solution Architect will need to be aware of Microsoft’s release cadence / approach.

There are different types of releases that Microsoft might make to the Power Platform and Dynamics 365.

  • Quick Fix Engineering (QFE) / Patches – in response to major issues or security risks Microsoft can make deployments worldwide within hours.
  • Weekly patches – bug fixes, performance enhancements, stability improvements (etc) are pushed out on a weekly basis. These weekly updates could include new admin features or user facing customizations which you must opt into. But they wouldn’t typically include changes which are user facing by default.
  • Twice annual release waves – Starting in April and October major changes which are user facing will be included in release waves. The release notes for these updates are published four months in advance of the wave starting. And you can “test drive” the changes two months prior to the release.

In line with the annual releases Microsoft publish roadmap details in release plan documents. The Solution Architect should be aware of what is on the roadmap that could impact the solution architecture. It may be that new features should be evaluated and decisions taken on when best to adopt the new capabilities. Often Microsoft release new features in a preview state, when this happens it may be useful to evaluate the feature but until the full release version is available it is not recommended that preview features are used for production purposes. (As they are likely to be changed or removed!)

Additionally, importantly along with each release wave Microsoft may announce feature deprecations. Typically when this happens the feature in question is still available but will no longer receive any updates and will potentially be completely removed from the Power Platform / Dynamics 365 in the fullness of time. The Solution Architect should therefore confirm no deprecated features are being used and when in use plan for their replacement.

Finally the Solution Architect should be mindful not to include any unsupported customizations. Avoiding these will help ensure the fast paced updates from Microsoft do not adversely impact the solution.

Another topic the Solution Architect should monitor is how the customer’s licenses may be impacted by feature deprecations or the bi-annual release waves. As it is not uncommon for new features to need new licenses!

Facilitate risk reduction of performance items

Often the activities involved in testing and go live preparation will come towards the end of the project life cycle, but I suggest it might be important to understand that testing and readiness does not have to wait until the end of a project. Continuous testing can be used to keep all stakeholders informed on solution quality and readiness; this can in turn improve your chance of success and lead to a smoother handover from “project mode” into business as usual.

Additionally post go-live it will be important to monitor the performance of applications.

For example reviewing the performance of your canvas apps should be an ongoing effort, you may (for example) be able to offload “work” completed in the canvas app into a Power Automate flow and therefore create a more responsive UI. You may be able to use Test Studio to create repeatable tests for your canvas apps. This will allow regression testing and become an important part of your software development life cycle (SDLC). Test Studio is a pretty new tool to help validate that canvas apps work as expected, you can read about it here.

The Solution Architect will help the customer understand the readiness of the system and any associated risks of known issues. In doing this they will help guide the customer towards an informed go live decision. Often the Solution Architect will know the system better than anyone so an informed risk assessment of the systems readiness is something the Architect is well placed to perform. They should know better than anyone what could break, what might not work as designed and what actions can be taken if the system goes down.

Troubleshoot data migration

During the testing phase, data migration should have been tested and the resulting data checked for completeness and accuracy. Repeating these tests multiple times should help ensure the final go-live migration operates as smoothly as possible.

With large implementations it may not be physically possible to migrate all of the historic data on day one. Therefore data migration plans may be needed to prioritize which data gets imported first.

Review and advise in deployment planning

The Solution Architect will typically be part of the delivery team that will have been put in place to build and test the solution. It will be the Architect’s role to help build this team and validate any development and testing plans. During the implementation phase the Solution Architect will become key in triaging any development problems.

Creating a deployment plan is essential in ensuring all go live steps are completed and sequenced correctly. The Solution Architect may not be the person who actually creates the deployment plan but the Architect should expect to provide input into the plan and act as a reviewer.

The Architect should whenever possible support the deployment process by identifying risks and helping form a “plan B”. The Solution Architect is the “champion of the solution” and may often be the first port of call when the customer is unhappy with progress during deployment.

Some common go live problems (to be avoided) may include;

  • No roll back plan – if the deployment goes bad what are we going to do? Can we roll back? (And has that roll back been tested?)
  • Not enough testing – testing may be very comprehensive but have you done enough real-world testing? Do the customizations really meet the users’ needs and does the system perform as expected with real user / data volumes?
  • Outstanding defects not evaluated – perfection is rare but have any outstanding issues been properly evaluated? Is the impact of fixing now compared to post go live fully understood?
  • Incorrect environment assumptions – avoid incorrect assumptions about end user workstations or network configurations.

The Architect should review deployment plans and seek out ways to streamline and simplify the plan. For example, can some of the data be migrated early or can we pre-deploy our mobile apps etc.

Additionally have all user logins been created, appropriate licenses assigned and do they all have the correct roles in the Dataverse (CDS)? If possible get all users to access a dummy production environment to help spot any issues early.

It may be the case that you have to deploy the new application as a “big bang”. But if possible consider if the old system can be run in parallel to the new. Meaning can we slowly move users over to the new system in a controlled manner. Or if a “big bang” is the only option then the Solution Architect may need to advise on the level of risk involved in this process.

Automate the go live

Consider how much of the go live deployment process can be automated. As an automated process is less likely to fail and is also more testable prior to the actual go live.

You may, for example, be able to automate the creation of users, teams and business units. Or you may be able to automate the import / creation of reference data. (Microsoft’s configuration migration tool may assist with this.)

But if you do use any automations during the go live process take care to ensure they are well tested.

With or without automations it is essential to have a go-live check list and a timeline. Plus test this check list! Does it include everything and are the timings achievable / realistic? And have we allowed sufficient contingency for any issues that could crop up?

Review and advise with Go-Live Assessment

It may be true that the Solution Architect will be involved in any go / no-go decisions.

Often we may envisage that a solution will be perfect by the time of go-live. In reality this is rarely the case! It will be likely that the Solution Architect may need to help communicate details of any known issues that will be present at go-live. The Architect may be called upon to evaluate the impact of fixing any outstanding issues now or once live. (In my experience fixing some issues post go live can be much more costly.)

Documentation at go-live may be critical to ensure decisions, agreements, tasks and assumptions aren’t forgotten. A quick agreement reached in the hallway can quickly be forgotten! Any agreements reached with the customer should be documented.

Plus any suggestions / recommendations made should be recorded, especially if the customer rejects a recommendation. You want to avoid someone incorrectly saying afterwards “why didn’t you tell us that??”.

Once the system is live the Solution Architect is often the first person who gets called when problems arise. They can help triage problems and isolate the root cause. In resolving go-live issues the Architect must consider the immediate impacts as well as the longer term. Can we mitigate the issue now with a “quick fix” whilst also considering a longer term solution to avoid future reoccurrences?

Omnichannel for Customer Service – WhatsApp


Microsoft’s Omnichannel for Customer Service supports the WhatsApp channel, in this post I will explain how I configured this to work in my environment.

WhatsApp is obviously a widely adopted social channel with many customers favouring engaging with businesses using WhatsApp. The WhatsApp feature of Omnichannel for Customer Service supports WhatsApp integration via Twilio.

You might need to appreciate WhatsApp message types and the 24 hour rule … as this is slightly different to some social platforms. We have two types of WhatsApp messages. Template messages and Session messages.

Template messages are outbound messages, these are transactional messages with a pre-approved format. (For example: “Your order has been dispatched”.) They can only be sent to users who have opted to receive messages from your organization. In this post I am going to concentrate on incoming messages! I might return to more detail around outbound / transactional messages in a later post. (As we can now send outbound SMS and WhatsApp messages!)

Session messages are messages that are incoming messages from a customer. The session would include the incoming messages and any subsequent outgoing replies. Sessions last for a maximum of 24 hours from the most recently received incoming message. Therefore WhatsApp conversations, unlike say SMS conversations only last for 24 hours and cannot continue over several days (or weeks) as could happen with SMS. Outside of the 24 hours, for an agent to re-engage with a customer they’d need to send a template message.
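To illustrate the 24 hour rule, the little TypeScript sketch below decides whether a reply can still go out as a session message or whether a pre-approved template message would be required. The function and the example timestamps are purely my own illustration, not part of the product.

// Conceptual sketch of the 24 hour rule: a session message is only possible while
// we are within 24 hours of the most recent incoming customer message.
function canSendSessionMessage(lastInboundUtc: Date, nowUtc: Date = new Date()): boolean {
  const twentyFourHoursMs = 24 * 60 * 60 * 1000;
  return nowUtc.getTime() - lastInboundUtc.getTime() < twentyFourHoursMs;
}

// Example: the customer last messaged 30 hours ago, so a template message is needed.
const lastInbound = new Date(Date.now() - 30 * 60 * 60 * 1000);
console.log(canSendSessionMessage(lastInbound) ? "Send session message" : "Send template message");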

Prerequisites

As WhatsApp integration is delivered using Twilio we first need a Twilio account and number. I explain how to setup a Twilio account and configure the SMS channel in this post.

Note: Currently Microsoft only support US Twilio phone numbers.

Now before you think about configuring the WhatsApp channel it makes sense to check that you are on the latest version of Omnichannel. Below you can see that in the Dynamics 365 administration center I checked that my sandbox was up to date. It wasn’t! So my first step was to apply the upgrade. Applying an upgrade can be a lengthy process. So whilst that happened I had plenty of time to read up on the install process. You can find Microsoft’s documentation on the WhatsApp channel here.

WhatsApp channel Setup steps ….

  1. Connect Twilio number to WhatsApp
  2. Copy Twilio Details
  3. Create WhatsApp Channel
  4. Add WhatsApp number into Omnichannel
  5. Validate your setup

Step One – Connect Twilio Number to WhatsApp

If you haven’t done so already your first step will be to connect your Twilio account to WhatsApp. The steps involved in connecting Twilio to WhatsApp are;

  1. Request access to enable your Twilio numbers for WhatsApp
  2. Submit a WhatsApp sender request to the Twilio console
  3. Approve Twilio messages on your behalf in the Facebook Business Manager console.
  4. Submit your Facebook Business Manager account for Business Verification
  5. Twilio completes your WhatsApp sender registration.

Simples!! Confused? You can read about this process in detail here.

Note:
You will need a Facebook Business Manager ID to be able to request authorizing Twilio. If you don’t have a Facebook Business Manager account already you can use this link to create one!

I won’t describe each of the above steps in massive detail. (The link I have provided does that!) But I will give some pointers.

Once I had created a Facebook business manager submitting the request to enable my numbers was simple enough. But you do have to wait for an email to arrive. That took a few days. And in my case I did have to create a support ticket with Twilio! But they did help me out pretty quickly.

After successfully completing step one and requesting your number is enabled for WhatsApp you should receive an email giving you the next steps. (Which involves starting the process to setup a WhatsApp sender in Twilio.) The steps are easy to follow! But I did notice the comment below in my pre-approval email. This isn’t a quick process!!

Once you have the pre-approval email from Twilio you are ready to enable a WhatsApp sender. For this you open the programmable messaging section in your Twilio portal. Here you will find a WhatsApp senders section. After I requested that my Twilio number be added as a WhatsApp sender I needed to wait again for approval.

Next you will get another email from Twilio. This time you need to approve messages being sent on behalf of WhatsApp. You can see below that you complete this approval in your Facebook business account.


At the same time you will need to ensure that your business is verified with Facebook.


This process of business verification and connecting Twilio can be lengthy! I really wanted to say the process was straight forward but I actually needed several support tickets with Twilio. I also ended up completing the business verification with Facebook several times. (Maybe the documents I supplied to verify my business address weren’t ideal!)

Eventually WhatsApp was correctly setup in Twilio …. you can see below that under programmable messaging my number has been approved by WhatsApp.

Step Two – Copy Twilio Details

As you set up the WhatsApp channel in Omnichannel for Customer Service you will need your Twilio Account SID and Auth token. These can be found on the first page of your dashboard in the Twilio account portal. (As shown below.)

Step Three – Create WhatsApp Channel

You are now ready to create the WhatsApp channel in Omnichannel for Customer Service. In the Omnichannel Administration app find the WhatsApp option in the channels section and use the “New” button to create a WhatsApp channel.

To create your WhatsApp channel you will need to give the channel a name and then enter your Twilio account SID and Auth token. (Copied in the previous step!)

Once you click save a Twilio inbound URL will be generated.

Now you need to return to your Twilio console. And in the WhatsApp senders option enter the Twilio inbound URL. (Your Twilio number must be approved as a WhatsApp sender before attempting this step!)

Below you can see that I have entered my Twilio inbound URL into my WhatsApp sender within the Twilio portal.

Step Four – Add WhatsApp number into Omnichannel

In your WhatsApp channel you can now enter your WhatsApp phone number.

Below you can see that I’d used the “+ New WhatsApp number” option within my WhatsApp channel in the Omnichannel Administration app.

Below you can see the details for my phone number. Notice I have selected the WhatsApp workstream. (In which I will have also defined any routing rules that I needed.)

I also navigated to the general settings tab and enabled the file attachments options. These are optional but I guess you will probably want to support swapping attachments with your customers.

Step Five – Validate your setup

Your final step is to validate your setup. Below you can see that I used the validate option in the Validation section of my WhatsApp channel.

Test

Finally in my WhatsApp mobile app I created a new contact including the phone number I’d created. And when I sent a message into this account I received the expected notification within Omnichannel for Customer Service.

Accepting the conversation allowed the conversation to begin in Omnichannel. And as I’d set the option to enable file attachments these could be sent within the conversation.

Conclusion

The setup process for the WhatsApp channel was not easy! But to be fair all of my challenges seemed to be connected with authorising my Twilio phone number for use with WhatsApp.

Once the process to approve my Twilio number for WhatsApp was completed making the number operate within Omnichannel for Customer Service was a pretty quick and easy process.

I can now receive inbound conversations from customers using WhatsApp. Next it is likely that I will also want to start outbound conversations with my customers. For that I will need some approved outbound WhatsApp message templates. I will dig into the process for configuring outbound WhatsApp messages in a future post!

Omnichannel for Customer Service – Outbound Messages (SMS)


We now have the capability in Microsoft’s Omnichannel for Customer Service to commence outbound conversations with our customers using SMS, WhatsApp and Twitter. In this post I will give an example of how we’d configure an outbound SMS. (In later posts I plan to cover WhatsApp and Twitter.)

You can read all about the outbound capabilities in Microsoft’s documentation here.

The overall concept is that we create an outbound message template, enable outbound messaging on a given channel and then create a Power Automate Flow that will send the message. Plus the message template can include dynamic values which can be provided by the Flow, an approach that allows us to personalize the outbound message.

When describing a feature I tend to like to show an actual example of how I’ve configured it. The Microsoft documentation describes a common scenario of sending a message when a case is created or resolved. They even include example Power Automate flows that you can download and amend. These will be common uses of outbound messages and I do suggest you review their examples.

In the interest of showing something different I have decided to use a different example … this will be based on a proof of concept I created for a real scenario!

My use case was a little different! I wanted to generate an SMS to someone when a conversation request arrived into Omnichannel. Why??? …. Well in my scenario we wanted to alert someone that a fresh conversation had started. Maybe you have someone on out of hours support who needs to monitor incoming conversations on a particular channel. If the volumes are low we can’t expect them to be glued to the screen 24/7, so having a “nudge” from an SMS might be useful.

This does mean that my example is a little more specific than those mentioned in the Microsoft documentation. But actually the steps involved are pretty much the same. Those being;

  1. Create a message template
  2. Configure outbound messaging
  3. Create a Power Automate to send the message

Tip: If a customer receives an outbound message, when they reply to that message they’d effectively commence an inbound conversation with one of your agents. Imagine, for example, that you send them an outbound message saying “Your delivery will be at 10am tomorrow”. They could reply explaining they wouldn’t be at home tomorrow etc. So my tip is, when considering outbound messages you might want to consider the full customer journey. As the outbound message may just be the start of the conversation with your customer.

Step One – Create a message template

Below you can see that I have opened my Omnichannel Administration app and within this located the messaging templates option. Within here you will see any existing templates, which you can amend. Or you can use the “+ New” option to create new templates.


Below you can see my example message template. It is a pretty simple message!

All I have done is given my template a name and picked the channel. SMS, in this example.

Next I enter the text for my message. Notice that I have entered “{Myname}”. Anything between the brackets will be substituted for dynamic information by my Power Automate. We’ll see how that works later in this post!

Tip:
My example is pretty simple in that I am only going to send the message in English. Notice that I could create multiple versions of the message with localised text if I wanted to support multiple languages.

Step Two – Configure Outbound Messaging

Now I have my outbound template I am ready to configure my outbound channel to use that template. Below you can see that in the Omnichannel Administration app I have located the “Outbound” option. Within this we can see my existing outbound configurations. You can edit those from here or use the “+ New” option to create a new one.

I actually have two providers of SMS defined! Telesign and Twilio. So in my example I have created both as I wanted to test working with either of my two SMS numbers. But each of these will leverage the same outbound template. Hopefully you can see that this gives you flexibility if you happen to have multiple SMS numbers defined. (Or WhatsApp and Twitter accounts if you wish to send outbound messages using those channels.)

Below I have shown one of my outbound configurations.

Notice the configuration ID. This is generated when I save the configuration. You will need to copy this as we’ll use it within the next step of creating a Power Automate Flow to send the outbound messages.

Also notice that I have selected the SMS channel and also picked which channel I want to use. Plus I have linked this configuration to the message template I created in the first step.

Importantly notice the “show in timeline” option. In my scenario I just wanted to send out the message and didn’t need it to show in the timeline of any related record. So I entered “No”.

In other scenarios you will want to link the outbound message to an associated case or other entity. If you need this enter “Yes”. But having done that you will also need to provide details of the regarding “object” in the Power Automate that we will create in the next step.

Step Three – Create a Power Automate to send the message

Now my Outbound message has been defined I needed to create a Power Automate to send the message. You could trigger the message in many ways … for example when a case is created. But in my example I will trigger the Power Automate when a conversation in Omnichannel is created.

I have shown the steps in my Power Automate below and will then expand on each of them. At a high level the actions in my Power Automate are as follows;

  1. Conversation (Created)…. As explained my Power Automate is triggered when a conversation is created.
  2. Initialize (ContactList)…. Firstly I initialize an array. At this point it will be blank but later we will add the contact who will receive this message.
  3. Get (NeilParkhurst Contact)… I added this step to query the contact I wanted to send the message to. (If you had a case maybe you’d read details for the customer associated to the case!)
  4. Append to array variable… Having queried my contact I add details for the message to my array.
  5. Compose (JSON)… next I format my array as JSON, as required by my next step.
  6. Unbound action (Send SMS)… My final step is to trigger an action to send the SMS.

In the following sections I will detail exactly how I defined each of my Power Automate actions.

Conversation (Created)

My first tile is pretty simple. Here I am simply saying that the Flow is triggered on create of a “Conversation”. In my final production version I may have wanted to use the advanced options to filter which conversation. Or maybe I would have followed this step with a condition to check if I wanted to send the SMS. But this was a simple proof of concept, so I simply trigger my Flow for every conversation that is created.

Initialize (ContactList)

Here I am simply initializing an array variable which will be called “ContactList”. I will use this later in my Flow.

Get(NeilParkhurst Contact)

In this next step I am running a query to get the contact that I want to send the message. (As I need to find their mobile number!)

Again keep in mind that this is a simple proof of concept example. As I have queried the contact directly from the ID of my contact. You will probably want to do something more creative to find the contact or contacts that need to receive this message.

Append to array variable

Here I am adding details of my contact into the array. Specifically I am defining the mobile phone number to use for my message.

Also notice that under context items I have set a variable called “Myname” to the first name of my contact. If you recall in step one I included “Myname” in my message template. Meaning at runtime the contact’s name will be inserted into my template.

I have shown the value for my append action below;

{
  "tocontactid": "@{outputs('Get(NeilParkhurst_Contact)')?['body/mobilephone']}",
  "channelid": "sms",
  "optin": true,
  "contextitems": {
    "Myname": @{outputs('Get(NeilParkhurst_Contact)')?['body/firstname']}
  }
}

Note:
If earlier you’d selected the option to add the message into the timeline, in this contextitems section you’d need to include the “regardingobjectid” and “entityrelationshipname” for the regarding table. If you are doing that you might want to review the example in the Microsoft documentation I referenced at the start of this post.

Compose (JSON)

The next action simply composes my array ready for use.

Unbound action (Send SMS)

My final action sends the message. You create an unbound action and enter the name “msdyn_InvokeOutboundAPI”.

Next we add the ID of your outbound configuration. (This is the ID from the outbound configuration which I commented you needed to copy in the previous step!)

You need to add the ID into a field called “msdyn_ocoutboundconfiguration msdyn_ocoutboundconfigurationid”.

Tip: I made a mistake at this point!!! As you can see from the screen shot below the names wrap and I couldn’t easily read the entire name. Therefore you need to be really careful that you assign the ID to the correct field as there are several with very similar names!

My final setting is to add the output from my “Compose (JSON)” action into the ContactList for this unbound action.

Once my Power Automate was turned on I began to receive text messages every time a new conversation arrived into Omnichannel. Honestly, this might not be the best example in the world! But I hope you can see that the overall concept is pretty straight forward and you could amend my idea for many situations.

Meaning the resulting solution is actually very flexible and pretty simple to implement. Enjoy.

MB-230: Microsoft Dynamics 365 Customer Service – Introduction


I am about to commence my revision for the MB-230 exam. This exam is for Microsoft Dynamics 365 and covers all aspects of customer service. As I revise I plan to publish blog posts that collectively will become a complete revision guide for anyone embarking on the same journey as me.

Who should study to MB-230

Anyone who wishes to gain the “Dynamics 365 Customer Service Functional Consultant Associate” badge will need to pass the MB 230 exam. (In addition to the Power Platform Functional Consultant exam, PL-200).

But why would you want to be a “Dynamics 365 Customer Service Functional Consultant Associate”? (Catchy name!!) Well, this badge demonstrates to employers that you have a deep understanding of the Power Platform and also specifically the functionality associated with all things customer service related.

Developers, business analysts, testers and more may find holding this badge beneficial. In fact pretty much anyone involved in the deployment of Dynamics 365 systems for customer service.

Renewals?

If you already have the MB 230 exam but your certificate is due to expire you may be eligible for a renewal! I recently had an email explaining that I needed to renew my “Dynamics 365 Customer Service Functional Consultant Associate” certificate. I’ve shown a copy of the email below, so you know what to look out for!

You will find that the renewal process includes a set of multiple choice questions that can be completed as a “simple” online survey. One major bonus being that I managed to complete this process for no cost.

If you are confident with the Dynamics 365 customer service you may feel able to dive straight in and take this online assessment. I would however suggest you pause and think before jumping in. (I passed my renewal but I honestly wish I hadn’t rushed!!)

You will almost certainly find that the questions will focus on the latest features or areas of the skills measured that were more recently added to the exam. With MB 230 in mind this might include more recently added topics like Omnichannel and more. So if you aren’t familiar with the latest features maybe some revision prior to taking the renewal test will help you??? (And anyway a bit of revision on your favourite subject won’t hurt you!)

What the exam covers

I will give you a break down of the specific skills measured later in this post but initially let’s consider at a high level what knowledge this exam will require you to possess.

Firstly you will need to understand quite a bit around cases. This isn’t a surprise as the case is the primary entity in customer service. You will need to understand everything connected with the management of cases including things like record creation rules, service level agreements, entitlements and much more.

Next you will need to be aware of how creating a knowledge base might help in customer service scenarios.

Queuing work is an important concept in many customer service processes. So you will need to understand how to define queues and how users might interact with queues.

Scheduling work is also a common challenge, so you will obviously need to understand how to define resources and schedule work for them.

Talking with the customer is important! This is where Omnichannel comes into play. Meaning you will need to understand how to define various digital communication channels and how you route the right conversations to the correct agents.

And finally you will need to understand what reporting / analytic capabilities are available to allow organisations to analyse their performance and identify areas which can be improved.

Exam preparation tips

Get a trial… you can sign up for a free 30 day trial at “trials.dynamics.com”. In my opinion having a trial instance to ensure you get plenty of hands on practice is essential. You can learn a lot by reading blogs and documentation about Dynamics 365 and the Power Platform. But when it comes down to it there is no substitute for getting your hands dirty. Use the product as much as possible as part of your revision.

Plan … you may have taken multiple Microsoft certifications before. Maybe you have even taken one of the previous Customer Service certifications. (For example, MB2-718.) This means you may feel you already know many of the skills measured in great detail. But you should still try to plan your revision carefully. Dynamics 365 is a massive product and you may find yourself being tempted into reviewing many topics outside the scope of the MB-230 exam. This is why I always start my revision process by printing off the skills measured statement. I then highlight all of the topics I know I need to review. This process allows me to be clear on what I need to learn. I can create a list of revision topics and tick them off as completed. Meaning I ensure I cover everything required and also I avoid “wasting time” reading irrelevant material.

Don’t rely on one source of information! … Hopefully this series of blog posts will prove useful. But I strongly encourage you to not rely on it! There are many blogs out there which are written by Dynamics 365 fans, Microsoft Certified Trainers and MVPs. Seek out these blogs and read as many as possible. Also, read and re-read as much information as you can from docs.microsoft.com … reading some of the content can occasionally be hard work! But there is no better source for information about Dynamics 365!

https://docs.microsoft.com/en-us/dynamics365/customer-service/

Skills Measured

So, as promised earlier, I have given a breakdown of the skills measured below.

Again I encourage you to look at the Microsoft content! As they do update the skills measured from time to time. And depending on when you are reading this blog post additional items may have been included! Or some old subjects removed.

https://docs.microsoft.com/en-us/learn/certifications/exams/mb-230

Manage cases and Knowledge Management
Create and manage cases
· configure cases

· manage case lists

· create and search for case records

· convert activities to cases

· perform case resolution

· implement parent/child cases

· merge cases

· set autonumbering for customer service entities

Configure and automate cases
· implement Advanced Similarity rules

· implement record creation and update rules

· implement case routing rules

· customize the Case Resolution form

· configure Status Reason transitions

· configure business process flows

· capture customer feedback by using Customer Voice

Implement Knowledge Management
· configure the Knowledge Search control

· link an article with a case

· use Knowledge Management to resolve cases

· manage the Knowledge Management article lifecycle

· manage Knowledge management articles

· configure entities for Knowledge Management

· manage Knowledge article templates

· implement Knowledge Search

· enable Relevance Search

· configure categories and subjects

· convert cases to knowledge articles

Manage queues, entitlements, and service-level agreements
Create and manage queues
· describe use cases for each queue type

· configure queues

· add cases and activities to queues

· configure entities for queues

· perform queue operations

Create and manage entitlements
· configure entitlements

· define and create entitlements

· manage entitlement templates

· activate and deactivate entitlements

· renew or cancel an entitlement

Create and manage SLAs
· define and create service-level agreements (SLAs)

· configure SLA settings

· configure a holiday schedule

· configure a customer service schedule

· implement actions by using Power Automate

· manage cases that are associated with SLAs

· manually apply an SLA

· create and manage SLA items

Implement scheduling
Manage resources
· configure business closures

· configure organizational units

· configure resources

· configure work hours

· configure facilities and equipment

Manage Services
· define services

· schedule a service activity

· configure fulfilment preferences

· create a schedule board

· schedule a service activity by using the schedule board

Implement Omnichannel for Customer Service
Deploy Omnichannel for Customer Service
· provision Omnichannel for Customer Service

· define user settings

· configure application settings

· manage queues

· configure skills-based routing

Implement Power Virtual Agents
· describe Power Virtual Agents components and concepts

· integrate Power Virtual Agents with Dynamics 365 Customer Service

· escalate conversations to a live agent

Manage channels
· describe use cases for the Channel Integration Framework

· configure channels

· enable the chat widget on websites

· configure pre-chat surveys

· configure proactive chat

· configure Secure Message Service (SMS)

Distribute work
· describe difference between entity routing and channel routing

· configure work streams

· configure entity routing

· configure routing values

· implement context variables

Configure the agent experience
· create macros

· define agent scripts

· configure Quick Responses

· configure sessions and applications

· configure notifications

Configure the supervisor experience
· configure Omnichannel Insights dashboard

· configure intraday insights

· customize KPIs for intraday insights

· enable sentiment analysis

Manage analytics
Configure Customer Service Insights
· describe capabilities and use cases for Customer Service Insights dashboards

· connect to Customer Service Insights

· manage workspaces

Create and configure visualizations
· configure interactive dashboards

· design and create charts

· design reports by using the Design wizard

Hopefully this post has given you a good overview of the MB 230 exam. In future posts I will complete a deep dive into each of the skills measured mentioned above with an aim of creating a pretty comprehensive revision guide.

MB-230: Microsoft Dynamics 365 Customer Service – Customer Service Overview


I am currently revising for the MB-230 exam. This exam is for Microsoft Dynamics 365 and covers all aspects of customer service. As I revise I plan to publish blog posts that collectively will become a complete revision guide for anyone embarking on the same journey as me. In this post I will give an overview of some of the basic concepts we can expect to find in customer service scenarios.

In my opinion, customer service is a critical element of Dynamics 365. I hold this opinion as all good customer relationship management (CRM) systems should include a heavy focus on customer service as they should always put the needs of the customer first. But why is this true?? The answer is simple … giving excellent customer service leads to a happy customer. It should be obvious why we want happy customers! Unhappy customers are not good candidates for repeat business and more importantly they are also likely to become social media detractors. Customers will often publicly comment on the support they receive! I would suggest that it is essential for every business to ensure their customers become fans who are “promoters” rather than “detractors”.

Giving good customer service involves having well defined business processes, excellent information and a consistent repeatable approach. Agents need to gain access to key information in a timely manner and have solutions available to them to quickly respond to customer requests. Dynamics 365 helps companies achieve this critical aim. As you review these revision notes I hope you can clearly see how cases, knowledge base, SLAs, scheduling, omnichannel and much more all come together to provide a set of tools to manage the critical business processes involved in customer service.

You may have seen diagrams similar to the one shown below many times before! It splits Dynamics 365 into the applications that make up the entire solution. I would suggest that customer service influences most of these modules. However commonly customer service will be directly connected with two applications, those being “Customer Service” and “Field Service”. The Customer Service application provides facilities such as case management and an integrated knowledge base to handle customer requests. The Field Service application is closely related but has a focus on providing service in the field. Imagine your company supplies air conditioning units. It would be likely that you’d have a number of engineers who would visit customer sites to install, service and repair your air conditioning systems. It is in scenarios like this that the Field Service application comes into play.

The MB 230 exam will have a focus on the core customer service application. We have another exam which specifically covers Field Service scenarios. (MB 240.) Meaning you might need to be aware that the Field Service application exists but I’m not expecting to need to revise any Field Service topics as part of my MB 230 preparation.

It may also be important to remind ourselves that Dynamics 365 is “simply” a set of applications that are built on the Power Platform and make use of the Dataverse. (aka the database behind the Power Platform.)

Power Platform applications are made up of Power BI (Business analytics), Power Apps (application development), Power Automate (process automation) and Power Virtual Agents (intelligent virtual agents). Dynamics 365 Customer Service is therefore a model-driven Power App that may make use of all of the elements of the Power Platform and that you can extend as required using the tools provided within the Power Platform.

This approach creates a powerful and extensible customer service application!

Before “diving into” the specifics of various aspects of Dynamics 365 Customer Service, it’s important to gain an understanding of key service principles, entities and any associated terminology. If you have been working with Dynamics 365 for any length of time these concepts should be pretty familiar but a bit of revision never hurt anyone!

The service module within Dynamics 365 can be used to address multiple customer scenarios. Including customers reporting faults, scheduling service visits or resolving issues. Or alternatively a customer may simply phone for some advice or guidance, in this scenario the use of the knowledge base may be essential. These types of scenario can be supported by business process flows which help guide the operators, ensuring a consistent approach.

It is important to understand the key record types and concepts applicable to servicing with Microsoft Dynamics 365. Some of these are covered at a high level in the table below. The concepts mentioned here will be expanded upon in my future posts.

Term / Record type – Description
Customer Records – When the term customer is used this can refer to the “contact” or “account” that requires service. In several situations you will find one field called “customer” that can link to either a contact or an account.

Note: In the context of service the customer field never links to a Lead.

Cases – The case is the key record type in service management; each case represents a single incident of a service request. (In fact the term incident is often used to refer to a case, you should therefore consider these as interchangeable terms.)

Companies may create cases for a multitude of reasons, including handling a complaint, logging a service request or even simply recording the fact that a customer has a question. As we will see in later posts, each case has a status. As cases are progressed they change from an open status to resolved.

Work Orders – Like cases, work orders are requests for service. However in Dynamics 365 when we use the term work order we’ll be referring to a service that needs to be completed in the field. (Therefore within the Field Service application.)
Activities – Interactions between a business and their customers, typically including phone calls, emails, tasks, letters and appointments.

Beyond these out of the box activities, other bespoke activity types can be created to reflect additional scenarios unique to a particular organization. It will be common for companies to use activities to log all interactions with a customer regarding a case.

Resolution Activities – Resolution activities are a “special” activity type regarding the resolution of cases; these are created when cases are resolved. And can later be used for reporting on service effectiveness.

Note: One case could, in theory, have multiple resolution activities. As it may be resolved, re-opened and resolved again.

Posts – Case resolution activities should not be confused with posts!

Posts are another concept which can be used to record key events on a table. In the example of cases this might involve the case being created, resolved and more.

Knowledge Base Articles – A knowledge base is a repository of informational articles used to help resolve cases. These may be used purely internally by the customer service representatives or also shared externally, by emailing articles to customers or possibly by the article being made public via a customer service portal.
Entitlements – Used to specify the amount of support services a customer is entitled to, for example a new customer may be entitled to 8 hours of phone support within the first three months of a contract. Entitlements include “entitlement channels” that further define how much support can be given by specific channels.

Entitlements may also define which service level agreement (SLA) to apply to which customer.

Service Level Agreements (SLA) – A service level agreement defines the level of service that your organization agrees to offer a customer. This may be for the time to respond to a service request and / or the time to resolve a request.
Queues – Queues are essentially lists; they offer a place to organize and store activities and cases waiting to be processed.
Subject Tree – A hierarchical list of subjects an organization can use to classify service cases and other records.
Products – Products are defined in the product catalogue and can be used to provide a detailed view of service events at a product level.

Before we dive deeper, let’s quickly look at some key areas that make up the service offerings within Microsoft Dynamics 365;

Item – Description
Case Management – As already mentioned cases are the key entity used for servicing a customer. Their basic usage is pretty straightforward but you will need to have an appreciation of several related topics, including entitlements and service level agreements.
Knowledge Base – The knowledge base supports the creation of articles that may be used to help resolve cases.
Customer Service Hub (CSH) – The term customer service hub refers to the Dynamics 365 app used to access all of the key features found in the customer service module.

Commonly this will be the app that customer service agents use the most.

Note: If you happen to see the phrase “Interactive Service Hub” (ISH) then this will be referring to a comparable app we found in earlier versions of Dynamics 365.

Customer Service Workspace – Customer Service Workspace is a relatively new customer service app. Like the customer service hub this provides all of the key features needed by customer service agents. But in addition it provides an interface similar to that found in “Omnichannel for Customer Service”. (More on that later!)
Unified Service Desk – Unified Service Desk (USD) is a tool which brings together multiple technologies into a single interface. You can use USD to create a “solution” which can help simplify the processes followed within high volume contact centres. (Regular readers of my blog might know I often write about this!)

Note: USD is aimed at scenarios often found in contact centres. Contact centre agents also now have the Omnichannel for customer service app.

Tip: From an MB 230 point of view, you may need to be aware of USD but I have not seen any specific references to USD in the skills measured statement. Therefore I am not expecting any detailed questions around USD.

Omnichannel for Customer Service – Omnichannel for Customer Service is an app designed for agents working in a service contact centre. It provides a multi-session / multi-tab interface which, with additional productivity enhancements, allows agents to quickly handle customer queries.

In addition conversations from digital channels are routed to agents using Omnichannel for Customer Service. These channels can be very varied, including web chat, Facebook Messenger, Twitter and many more.

As mentioned above “Customer Service Workspace” and “Omnichannel for Customer Service” share the same interface approach.

Field Service – Field Service is an application that helps organizations manage their field engineers. It adds many capabilities including work order creation and management, work order scheduling, mobile access for field agents, inventory management etc.
Reporting / Insights – Microsoft Dynamics 365 supports an organisation’s reporting requirements by making use of dashboards, goals and Power BI.
Customer Voice – Customer surveys are possible using the Customer Voice feature. This can be used to help obtain metrics on customer satisfaction levels following incidents of service.
Scheduling – The customer service app includes the ability to route work to appropriate resources. This concept supports definition of working hours, resource skills and many more concepts, all of which work together to create a sophisticated scheduling tool.

The scheduling is completed on a schedule board which gives a rich graphical representation of available agents and assigned “jobs”.

Power Virtual Agents – Power Virtual Agents can be used in conjunction with Omnichannel for Customer Service. You might use a virtual agent to answer questions from a customer and, if / when the virtual agent is unable to assist the customer, it can perform a hand off to a human agent.

Hopefully this post has given you a very quick introduction into the concepts needed for the customer service certification (MB 230). In later posts we will dive deeper into many of these concepts.


MB-230: Microsoft Dynamics 365 Customer Service – Case Management (Part One)


I am currently revising for the MB-230 exam. This exam is for Microsoft Dynamics 365 and covers all aspects of customer service. As I revise I plan to publish blog posts that collectively will become a complete revision guide for anyone embarking on the same journey as me. In this post I will begin to describe the important topic of case management.

Part of the skills measured statement relating to case management is shown below. From this we can see that case management covers a wide variety of capabilities;


We need to cover quite a bit of ground! In this post I will include;

  • An overview of Cases
  • Case Views
  • Searching Cases
  • Creating Cases from Activities

In a later post I will cover other topics including;

  • Resolving Cases
  • Parent / Child Cases
  • Merge Cases
  • And more!

As already mentioned cases are a fundamental part of the customer service functionality in Dynamics 365, therefore I suggest you spend a significant amount of your exam preparation time creating cases and then experimenting with the various ways to route, resolve, cancel, merge them and so on.

Cases – An Overview

Cases (also known as incidents) are the most fundamental entity in the service module of Dynamics 365, it will therefore be essential that you have a detailed understanding of their management. If you are familiar with Dynamics 365, you may find much of this post covers concepts that you are already familiar with but if you are preparing for the MB-230 exam a little revision won’t hurt.

Each case represents a single incident of support, some companies may refer to a case as a ticket, service request etc. Each customer can have multiple open cases at any moment in time. Cases can have subjects, knowledge base articles, products and / or entitlements related to them. Cases also have activities associated with them. And cases can originate from activities, such as an email support query.

Cases can be created directly from the main case form, a quick create form or by converting activities. (Activities can include, appointments, emails, phone calls, faxes, letters, service activities, campaign responses, tasks and even custom activities.) It is also possible to use record creation rules to automate creation of cases, for example on receipt of an email.

Out of the box the customer (account or contact) and case title are mandatory fields on a case.

Cases, by default, utilize a business process flow to display stage information. The default stages are identify, research and resolve. (We’ll look at this business process flow in more detail later in this post.)

Over and above the typical actions available from the ribbon bar we have multiple actions that are specific to cases. Case specific actions include;

  • Save & route
  • Create new case
  • Create child case
  • Resolve Case
  • Reactivate Case (if viewing a resolved case)
  • Cancel case
  • Adding to a queue or viewing current queue item detail
  • Do not decrement entitlements
  • Convert to work order (assuming Field Service is installed)

Note: Some of these actions may be self-explanatory. But I will return to some options in later posts. For example, to appreciate the reasons for not decrementing entitlements you will first need to understand entitlements.

Tip:
I strongly recommend that your revision includes testing out all of these case specific actions. (Especially any options you are not familiar with or ones you haven’t used recently.)

Additionally you will find many options common to all tables in Dynamics 365 including things like;

  • Save & Close
  • Assign
  • Follow / unfollow
  • Delete
  • Email link
  • Share
  • Word Templates
  • etc.

Out of the box tabs on the case form include;

  • Summary (Case title, ID, subject, customer, timeline etc.)
  • Details (type of case, parent case, escalation details etc.)
  • Case Relationships (merged cases, child cases and associated knowledge articles)
  • SLA

Tip: It might be worth remembering that the case form can be customized. Meaning your form could include slightly different tabs to the ones mentioned above. What will be important is understanding all of the concepts associated with each attribute on the form, rather than its position!

In the center of the “standard” case form the timeline gives users access to posts, activities and notes regarding the case. This becomes a very important control to visualize the efforts completed towards resolving the case. We can also create activities and notes directly from the timeline.

Tip: The timeline is an important concept! Therefore as part of your revision I suggest you experiment by creating activities and notes directly from the timeline. Plus ensure you are familiar with how we can filter and search the timeline.


Case Views

Views are lists of data and are used to filter results. Examples include “Active Cases”, “My Cases” and “Resolved Cases”. Views can be system views or personal views. Many system views are provided out of the box but others can be added by developers / customizers. Personal views are saved “advanced finds” and can be created by users. Personal views can be shared with other users and teams in the organization.

All tables in your model driven app will have a set of system views out of the box. With cases you will find that quite a few useful views exist. These will allow users to view their active cases, resolved cases, cases being followed and much more. I suggest you become familiar with the views available on cases!

From any case view the user can select multiple rows in a view. They then have the ability to perform actions on all the selected rows including delete, bulk edit, merge cases, apply routing rules, assign or add to a queue. If just one case is selected it is possible to resolve or cancel the case directly from the view. (You can NOT resolve or cancel when multiple cases are selected.)

Tip: You can merge cases when multiple rows are selected. I will cover the concept of merging cases in a later post.

Views can be sorted by clicking on column headings. Then selecting the sort method. Typically this will be “A to Z”. But with dates we can also sort “Oldest to Newest”.  Plus you can sort by multiple columns by holding down the shift key and selecting additional columns. An arrow will show in the column heading depicting the direction of the sort, clicking the column again will change the direction of the sort.

We can also use the filter by button to refine the rows shown in our view. (Note: If you have filtered a column, clicking the column heading again will give you a clear filter option.)

Case Dashboards

Another useful way agents can filter cases is to use the case dashboards. The case table has a case specific dashboard enabled by default. To access this you can use the “Open Dashboard” option from any case view.


I suggest you become familiar with how we can use visual filters and more to help agents navigate cases. Showing the visual filters can give the agents a very quick way to navigate to high priority cases (and more).


Additionally agents can perform actions on the cases directly from the dashboard. By clicking “…” they can resolve cases (and more) directly from the dashboard views.

Searching Cases

Cases are commonly searched directly from views by selecting a specific view or using the filter functionality. It is also possible to search for specific cases by typing directly into the quick find search box, by simply entering what is required or by also entering wild cards. Typically, the direct search will use case number or case title to locate the case, however customizers have the ability to extend this functionality to decide exactly which fields will be included in the search. (By amending the “quick find view” on the case table.)

You should understand that this search will look in the current view to locate the required case. So in my example I have searched “All Cases”. But if I had opened the “Active Cases” view first then I’d be looking at just open cases.

Additionally cases may be returned (along with other entities) in the relevance search. You access this style of search from the search box in the command bar. Notice that the relevance search results will include other entities, but you can filter by the case record type if required. Also notice that I can easily filter and sort the results on the search.

Tip:
If you haven’t used the search function recently, I suggest your revision includes some hands on time searching for cases and other entities. (Not least because in recent releases Microsoft have made some improvements in this area!)

It might also be worth noting that you can open and interact with the cases directly from the relevance search. (As shown below.)


Create Cases from Activities

Activities track relevant interactions between a company and their suppliers / customers. There are many situations when you might want to convert an activity into a case, for example a customer may email about an issue with a product. The email can then be converted into a case. Any of the activity types can be converted into a case. (So, Letter, Fax, Phone call, email, appointment or task.) To demonstrate this, below you can see that I can convert a phone call into a case.


In addition to being able to convert out of the box activities you can also convert custom activities into cases. Below is an example of a custom activity “SMS Message”, notice that I still have a “To Case” option.


When converting an activity into a case, you confirm the customer and case subject. (Don’t forget that the customer field could be a contact or an account.) You can also select to automatically open the newly created case record and optionally also close the activity that is the source of the case.


When an activity is converted into a case the originating activity is automatically tracked against the newly created case. You can see this below on a case I have created from my phone call. Notice the originating activity shows in the timeline on the newly created case.


I hope this post has started to explain the basic concepts connected with case management. As I said in my introduction I will split this topic across two blog posts. In my next post I will cover resolving cases, routing cases, parent / child cases and merging cases.

MB-230: Microsoft Dynamics 365 Customer Service – Revision Guide


I have been completing a series of posts to help people prepare for the MB 230 certification (Microsoft Dynamics 365 for Customer Service). Here is a collection of links to all of those posts. I hope these might serve as a useful revision aid for the MB 230 exam.

The MB 230 certification is not easy! I guess because customer service is a big topic covering many aspects of your “CRM” system. Including the ability to raise tickets (aka cases) but also how to automate the processes around those cases. We also have additional topics such as Omnichannel which cover how we engage with our customers. And then (of course) we’ll need to understand how to report on service metrics.

Note: This guide is a work in-progress, please keep checking back for updates.

Introduction

Customer Service Overview

Manage cases and Knowledge Management
Create and manage cases
· configure cases

· manage case lists

· create and search for case records

· convert activities to cases

· perform case resolution

· implement parent/child cases

· merge cases

· set autonumbering for customer service entities

Configure and automate cases
· implement Advanced Similarity rules

· implement record creation and update rules

· implement case routing rules

· customize the Case Resolution form

· configure Status Reason transitions

· configure business process flows

· capture customer feedback by using Customer Voice

Implement Knowledge Management
· configure the Knowledge Search control

· link an article with a case

· use Knowledge Management to resolve cases

· manage the Knowledge Management article lifecycle

· manage Knowledge management articles

· configure entities for Knowledge Management

· manage Knowledge article templates

· implement Knowledge Search

· enable Relevance Search

· configure categories and subjects

· convert cases to knowledge articles

Revision Guides

Case Management – Part One

<<Case Management – Part Two>>

<<Case Automation – Part One>>

<<Case Automation – Part Two>>

<<Knowledge Base>>

<<Customer Voice>>

Manage queues, entitlements, and service-level agreements
Create and manage queues
· describe use cases for each queue type

· configure queues

· add cases and activities to queues

· configure entities for queues

· perform queue operations

Create and manage entitlements
· configure entitlements

· define and create entitlements

· manage entitlement templates

· activate and deactivate entitlements

· renew or cancel an entitlement

Create and manage SLAs
· define and create service-level agreements (SLAs)

· configure SLA settings

· configure a holiday schedule

· configure a customer service schedule

· implement actions by using Power Automate

· manage cases that are associated with SLAs

· manually apply an SLA

· create and manage SLA items

Revision Guides

<<Queues>>

<<Entitlements>>

<<SLAs>>

Implement scheduling
Manage resources
· configure business closures

· configure organizational units

· configure resources

· configure work hours

· configure facilities and equipment

Manage Services
· define services

· schedule a service activity

· configure fulfilment preferences

· create a schedule board

· schedule a service activity by using the schedule board

Revision Guides

<<Resources>>

<<Services>>

Implement Omnichannel for Customer Service
Deploy Omnichannel for Customer Service
· provision Omnichannel for Customer Service

· define user settings

· configure application settings

· manage queues

· configure skills-based routing

Implement Power Virtual Agents
· describe Power Virtual Agents components and concepts

· integrate Power Virtual Agents with Dynamics 365 Customer Service

· escalate conversations to a live agent

Manage channels
· describe use cases for the Channel Integration Framework

· configure channels

· enable the chat widget on websites

· configure pre-chat surveys

· configure proactive chat

· configure Secure Message Service (SMS)

Distribute work
· describe difference between entity routing and channel routing

· configure work streams

· configure entity routing

· configure routing values

· implement context variables

Configure the agent experience
· create macros

· define agent scripts

· configure Quick Responses

· configure sessions and applications

· configure notifications

Configure the supervisor experience
· configure Omnichannel Insights dashboard

· configure intraday insights

· customize KPIs for intraday insights

· enable sentiment analysis

Revision Guides

<<Deploy Omnichannel>>

<<Power Virtual Agents>>

<<Omnichannel channels>>

<<Distribute / Route Work>>

<<Omnichannel Agent Experience>>

<<Omnichannel Supervisor Experience>>

Manage analytics
Configure Customer Service Insights
· describe capabilities and use cases for Customer Service Insights dashboards

· connect to Customer Service Insights

· manage workspaces

Create and configure visualizations
· configure interactive dashboards

· design and create charts

· design reports by using the Design wizard

Revision Guides

<<Customer Service Insights>>

<<Customer Service Visualizations>>


Omnichannel for Customer Service – Simplify Navigation


I recently read about the ability to simplify the user navigation in Microsoft’s Dynamics 365 Omnichannel for Customer Service and Customer Workspace apps. I’m all for something that might reduce my mouse clicks! In this post I will explain how I configured this option and give my initial thoughts.

Normally in the multi-session interfaces we have in the Omnichannel for Customer Service or Customer Service Workspace apps, users can manually create sessions by holding shift and clicking the mouse on a Dynamics 365 record / hyperlink. Or to open new tabs in an existing session they can hold the Ctrl key and click the mouse.

Don’t get me wrong this navigation works really well. But wouldn’t it be “nice” if users didn’t have to hold the shift or Ctrl keys? And more importantly had to put less thought into when to start sessions or open tabs in existing sessions.

As I have already mentioned we have two apps that contain this multi session interface. In my opinion the biggest benefits might be achieved by applying this idea to the Customer Service Workspace app. My logic being that with Omnichannel as users click on notifications for incoming communications new sessions will automatically start anyway. But with Customer Service Workspace we might not be using the communication channels and simply want to start new sessions as cases or any record is opened from dashboards or views.

You can read Microsoft’s full instructions on enabling this feature using the link below. I will mention the process in this post but I encourage you to read these instructions. And you will also need to copy some code which is best done directly from their instructions.

https://docs.microsoft.com/en-us/dynamics365/customer-service/csw-overview#navigate-and-view-records

Setup Simplified Interface

So first of all I logged into Dynamics 365 and opened my Customer Service Workspace app. I then pressed F12 to open the developer tools and selected the console tab.

Next I copied the code mentioned above into the console tab to run it. After running the code you need to run one of three different commands.

For Customer Service workspace: CSWAppUtility.setMultisessionNavigationImprovementsSetting();

For Omnichannel for Customer Service app: OCAppUtility.setMultisessionNavigationImprovementsSetting();

For both the multisession apps: AppSettingUtility.updateAppSetting('msdyn_MultisessionNavigationImprovements', true);
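
Putting that together, below is a quick sketch of how the full console sequence might look. This is just my illustration; it assumes you have already pasted the helper script from Microsoft's instructions (which is what defines the CSWAppUtility, OCAppUtility and AppSettingUtility objects mentioned above).

// 1. Open the app, press F12 and select the console tab.
// 2. Paste the helper script copied from Microsoft's documentation.
// 3. Run ONE of the following commands, depending on which app(s) you want to update;
CSWAppUtility.setMultisessionNavigationImprovementsSetting();   // Customer Service workspace only
OCAppUtility.setMultisessionNavigationImprovementsSetting();    // Omnichannel for Customer Service only
AppSettingUtility.updateAppSetting('msdyn_MultisessionNavigationImprovements', true);   // both apps
// 4. Wait for the success message and then refresh your browser.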

Below you can see that I ran the command for the Customer Service Workspace and soon afterwards received a message that the update was successful.

Testing the change

Before I could test the change I did find I needed to refresh my browser. But once done my simplified navigation was working. Simples!

As you can see below, from any “home” page clicking on a case (or any record type) will automatically start a session containing that case.

Tip: With the Customer Service Workspace you can define a session template for each entity. So if you want to control what tabs open in the session you could do this using a template.

Now my session is open clicking on any link will open a new tab in that session. Below you can see that I clicked on the customer name and a new tab opened.

But what happens if the user clicks on another account record? …. Well then another tab would open for the second account.

Or what happens if the user clicks on the same account again? …. If the record you are trying to load is already open in your session it doesn’t open another tab. Instead the existing tab is given focus. I really liked that feature!

If you create a new record (using quick create) the quick create form opens on the right hand side, as normal. When you have finished with the form you return to the tab within your session. Whilst in the middle of a quick create the rest of the screen becomes inactive. So you can’t swap sessions whilst creating the new record. (Which felt reasonable to me!)

And if you try and create a new record when no quick create form is available the new form simply opens in a new tab in the current session.

Also, if I am in a session and I would like a new session to open for the record I’m opening, then I can still use “Shift Click” to start a new session.

Conclusion

So what do I think???

This is a subtle change but it is actually one that will greatly benefit users. I think anyone “just” managing cases or other entities directly in the Customer Service Workspace will find this simplified navigation to be exactly what they’d want. I doubt anyone will want to reverse this change after they’ve enabled it.

I will flag that I haven’t tried to disable this style of navigation! And I’m not sure the Microsoft instructions mention an approach for removing this change. So you might want to test this out in a sandbox before applying the change to your production instance.

I think mixing this change with session templates and app profiles could result in an excellent experience. If you don’t know about session templates or app profiles I suggest you look them up! Plus I might create a blog post for those soon.

All in all, nice job Microsoft.

MB 600: Microsoft Dynamics 365 + Power Platform Solution Architect – Design data and security model (Part One)


I am currently preparing to take the Microsoft Dynamics 365 + Power Platform Solution Architect certification. Or “MB 600” for short! As I prepare I plan to write revision notes; in this post I will cover topics around designing the data and security models.

Data is a big topic! I therefore plan to split this information into two post. In this first post I will cover an overview of security and data design.

Note: Microsoft have recently updated some of their terminology. Therefore we should consider terms like CDS and Dataverse as interchangeable. In terms of the exam I would expect them to begin using the term Dataverse at some point but that change is unlikely to be immediate. Meaning in the short term either term could be used. I created this blog post before the revised terms were announced. I have tried to apply updates but a few “outdated” references may still exist!


Security Overview

As a Solution Architect we need to consider if there are any security, regulatory or compliance requirements that will impact the solution design. Unfortunately, I have often seen security handled as an afterthought but the Solution Architect should consider security throughout the entire application lifecycle. From design and implementation to deployment and even ongoing into operations.

A discovery phase should review / document existing security measures. This is because a single project is unlikely to change the entire approach to authentication. You should understand if single sign-on is already in place, if the customer is using 3rd party authentication products or “just” Azure Active Directory, and if multi-factor authentication is in place.

It is also important to consider how the organizations structure may influence security models. To give just one example, I once worked with a large insurance company. For them it was critical that the data held by insurance brokers was kept isolated from records held by the insurance underwriting teams. These types of organizational structural requirements could lead the Architect to conclude multiple business units, environments or even tenants are required.

The Architect may need to review / design multiple layers of security. Including;

  1. Azure AD conditional access– blocking / granting system access based on user groups, devices or location.
  2. Environment roles– include user and admin roles (such as global admin). You can read about 365 admin roles here.
  3. Resource permissions for apps, flows, custom connectors etc.
  4. Dataverse (aka CDS) security roles– control access to entities and features within the Dataverse environment.
  5. Restrictions to the 300+ Dataverse connectors– data loss prevention policies (DLP) used to enforce how connectors are used.

You can read about restricting access here.

Security Groups

Within the Power Platform admin center we can optionally associate a Dataverse environment with a security group.

Whenever a license is assigned to a user, by default a user record would be created in all enabled Power Platform environments within the tenant. This could effectively grant them access to all environments within the tenant.

Security groups can be used to limit access to environments. Then only users who are added to the associated security group will be added to the environment as a user. If a user is ever removed from the security group they are automatically disabled.

Note:
If a security group is associated with an existing environment all users in the environment that are not members of the group will therefore be disabled.

Tip:
Whilst security is massively important, when presented with a requirement to restrict access to data it is worth questioning if this is really a security requirement. Or is it just filtering of data for convenience? By doing this you should create a solution which is secure but does not include any unnecessary boundaries.

Data Overview

Additionally you may need to consider any specific requirements around data storage. How long must data be retained? Does it need to reside in a specific country? Are there any laws that apply to how the data can be stored / used?

Sometimes regulatory requirements may also impose specific service level agreements or turnaround times. Or dictate that specific parties / governing bodies must be kept informed or involved in certain transactions.

Data should always be considered as a valuable asset. Therefore designing a security model to ensure the proper usage and access to that valuable asset is paramount. Features like Azure Conditional Access and Data Loss Prevention Policies can be enabled. Additionally ensuring proper usage of secrets / certificates for the services that access the data may be essential.

Below you can see a diagram which highlights the layers of security which are available. When working with Dynamics 365 an Architect maybe tends to have a focus on the security roles found in the Dataverse. But it should also be noted that beyond these roles additional layers of security exist. For example, to give conditional access to Azure or restrict access to connectors.

There are numerous standards for data compliance covering industry certifications, data protection and physical security. During the requirement gathering phases the Solution Architect should question which standards are applicable and may need to confirm compliance. Some examples include ISO27001, GDPR etc. Microsoft publish details of various standards and their compliance here.

Design entities and fields

The Solution Architect should lead the data model design. With model-driven apps …. It is not uncommon for my design work to actually start with what data is to be stored and build out from there. Therefore establishing a high-level data architecture for the project can be an essential early task.

It may be common for the Solution Architect to design the data model at a high-level before other individuals extend their design. For example, the architect might define the core entities and their relationships but maybe the fields within those entities will be defined later by the design team. If this is the case the Architect would still need to review these detailed designs and provide feedback as the detailed data model evolves.

The Dataverse includes the Common Data Model! This is a set of system tables which support many common business scenarios. The Common Data Model (CDM) is open-sourced in GitHub and contains over 260 entities. Many systems and platforms implement the CDM today. These include Dataverse, Azure Data Lake, Power BI dataflows, Azure Data services, Informatica and more. You can find the CDM schema in GitHub here.

In addition to the industry standard CDM schema, Microsoft provide industry specific accelerators aimed at particular vertical markets. Examples include Healthcare, Non-profit, Education, Finance and Retail. ISVs may then create industry specific apps which leverage these accelerators. You can find out about the accelerators here.

Whenever possible the standard system tables within the Common Data Model should be used. For example, if you need to record details about customers use the account table for that. This will not only make the system quicker and easier to develop but will aid future maintainability. All Architects should avoid re-creating the wheel!

It will be common to create diagrams to illustrate the data model, often called Entity Relationship Diagrams (ERDs). These show the entities (aka tables) and how they relate to other tables.

In my ERDs I like to highlight which entities are out of the box with no change, which are leveraging out of the box tables but with additional custom fields and which are completely custom.

Typically we will be thinking about tables and columns held within the Dataverse (CDS), and conceptually it might be easy to think of Dataverse as a traditional database. But the Dataverse is much more than that! As it includes a configurable security model, can support custom business logic that will execute regardless of the application and even stores data differently depending on its type. Relational data, audit logs and files (such as email attachments or photos) are all stored differently.

Sometimes, rather than depicting the tables in an ERD the Solution Architect may first create diagrams to illustrate the flow of data within the solution. Without needing to “worry” about the physical implementation. These diagrams are known as logical data models. Only once the logical data model is understood would the Architect then create a physical data model based on the logical model. The physical data model could take the form of an ERD and could include data within Dataverse, Azure Data Lake, connectors or other data stores.

There are several strategies / techniques that can be used to help the Architect when creating a data model;

  • Always start by depicting the core tables and relationships– having a focus on the core tables will avoid getting side-tracked into smaller (less important) parts of the solution.
  • Avoid over normalization– people with a data architecting background may tend to try and build a Dataverse data model with traditional SQL database concepts in mind. A fully normalised database within the Dataverse may have an adverse impact on the user experience!
  • Start with the end in mind– it is often useful to start off by defining the final reporting needs, you can then confirm the data model adequately meets those requirements.
  • Consider what data is required for AI– if you intend on implementing an AI element into your solution design consider what source data will be needed to support any machine learning / AI algorithms.
  • Plan for the future– consider the data model requirements for today but plan for how this might evolve into the future. (But avoid trying to nail every future requirement!)
  • Use a POC– creating a proof of concept to demonstrate how the data might be represented to users can be a valuable exercise. But be prepared that this might mean trying a data model and then throwing it away and starting again.
  • Don’t build what you don’t need– avoid building out parts of the data model you don’t plan to use. It is simple to add columns and tables later, so add them when you know they are required.

Once the tables within your solution design have been considered you will need to consider the detailed design task of creating columns (aka fields). Columns can be of many different data types, some of which have specific considerations. I will mention just a few here;

  • Two options (yes/no)– when you create these ensure you will never need more choices! If unsure maybe opt for a “Choices” column.
  • File and image– allows the storing of files and images into the Dataverse.
  • Customer– a special lookup type that can be either a contact or account.
  • Lookup / Choices (Optionsets)– which is best for your design! Optionsets (now known as Choices) make a simple user experience but lookups to reference data can give more flexibility to add options later.
  • Date / Time – be careful to select the appropriate behaviour. (local, time zone independent, date only)
  • Number fields– we have many variations to select from. Choose wisely!

Other options exist for storing files / images. Having images and documents in the Dataverse might be useful as security permissions would apply and the user experience to complete an upload can be “pleasing”. But size limits do apply so storing large files might not be possible. Other options, like SharePoint, which is ideal for collaboration, exist. Or you could consider storing the files in Azure storage which might be useful for external access or archiving purposes. As part of your revision …. you may need to be aware of the pros / cons of various methods to store files!

Design reference and configuration data

When we create a table in the Power Platform there are some key decisions to make about how the table is created. Some of these cannot be easily changed later! For example, should the rows be user / team owned or organization owned.

User / team owned records have an owner field on every row in the table. This in turn can be used to decide what level of security is applied for the owner and other users. One good out of the box example of a user owned table might be the “contact” table. Each contact in the system is owned by a single user or team. It might then be that only the owner can edit the contact but maybe everyone can see the contact.

Alternatively tables can be organisation owned. With these you either get access or not! The records within the table are often available to everyone in the organization. These tables are ideal for holding reference / configuration data.

Often one consideration when designing reference data is to consider if a Choice (optionset) or a lookup to an organization owned table is the best approach. I find “choices” most useful for short lists which rarely change. Whilst lookups are ideal for longer lists that might evolve over time. (As users can be granted permissions to maintain the options available in a lookup. But as the Choice column forms part of your solution, the development team would need to alter the items in a Choice column.)

MB 600: Microsoft Dynamics 365 + Power Platform Solution Architect – Design data and security model (Part Two)


I am currently preparing to take the Microsoft Dynamics 365 + Power Platform Solution Architect certification. Or “MB 600” for short! As I prepare I plan to write revision notes; in this post I will cover topics around designing the data and security models.

Data is a big topic! I therefore plan to split this information into two posts. In this second post I will dive deeper into complex scenarios and discuss the more technical aspects of Dataverse (CDS) security.

Note: Microsoft have recently updated some of their terminology. Therefore we should consider terms like CDS and Dataverse as interchangeable. In terms of the exam I would expect them to begin using the term Dataverse at some point but that change is unlikely to be immediate. Meaning in the short term either term could be used. I created this blog post before the revised terms were announced. I have tried to apply updates but a few “outdated” references may still exist!


Design complex scenarios

Often a Dynamics 365 application will simply access data that resides in the Dataverse. However Power Automate, Power BI and Power Apps (Canvas Apps) can leverage data connectors to access data from many hundreds of sources. Additionally custom connectors can be created as required. These connectors allow us to leverage existing data source and services.

When an “off the shelf” connector does not exist a custom connector can be created. This might work by making use of an existing API. Or you may need to create a custom API and define your own actions. Connectors can make use of OAuth (including Azure AD), API keys and basic auth. Connectors can be packaged and deployed via solutions. Creating a connector that is then available to users for re-use is a great way to empower them to extend the solution using the Power Platform.

Data modelling on the Power Platform should therefore look at the whole data architecture picture and include a logical look at data from the Dataverse, Data Lakes and external sources using connectors.

Azure Data Lake is a hyper-scale repository for big data analytics. Azure Data Lake can store data from all disparate sources, including cloud and on-premises sources of any size and structure. The Data Lake can include CDM and can be integrated with Dataverse. The huge scale of the Data Lake may help support complex scenarios like machine learning, AI and complex reporting when extremely large data volumes may be required.

  • Dataverse– used for transaction data that apps will consume and maintain. Structure typically based on the common data model.
  • Azure Data Lake– ideal for data from other / multiple systems, read focused and can leverage the common data model.
  • Connectors– great for leaving the data where it is. Supports accessing external sources to make their data available in apps.

There are many drivers which may influence data model decisions, including;

  • Security requirements
  • User experience– it’s easy to forget that as we add normalization and relationships we create new constructs users need to navigate to be successful
  • Data location– some data must be stored in a given GEO
  • Retention policies– not all data can be stored indefinitely!
  • Self-service reporting– if the data model becomes complex can a “regular” user still extract the data for their reporting purposes? Or will all reports end up needing a data architect to design them??
  • Roadmap – what are the plans for the future?
  • Existing systems– are we going to connect with any existing systems or does any existing data need to be “squeezed” into the new data model.
  • Localization– Multi-region, multi-lingual, multi-currency requirements

Custom Activities

Out of the box we have activities like phone call, email and task. But custom activities can be created as required. The advantage of creating a custom activity is that they show in the timeline lists with other activities. But you need to be aware that security is applied consistently across all activities. So if a user is given the ability to delete tasks, this would actually mean they can delete any activity type. (Including your custom activity.) Activities can be regarding any table that is enabled for activities, meaning a custom activity could be regarding “anything”. (Although you could consider some control by limiting which entities are available to users in your model-driven app!)

Calculated and Rollup Fields

The Dataverse provides us with options to create calculated or rollup fields.

Calculated fields are populated on retrieval of records. They are read-only. They can be based on fields within the table or its parent.

Rollup fields are stored on the table. Like calculated fields they are read-only; they are calculated (re-calculated) based on an update schedule or on demand. The values on the parent are rolled up from child records. (1:N only!) Filtering on the related table can be applied.

In complex scenarios …. it is possible to include rollup fields in calculated fields. And it is possible to rollup “simple” calculated fields.
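
As a small aside, rollup columns can also be recalculated on demand programmatically. Below is a minimal sketch (my own illustration, not part of the exam content) using the Dataverse Web API CalculateRollupField function. The environment URL, record id and column name are all hypothetical placeholders; and outside of an authenticated browser session a bearer token would also be required.

const orgUrl = "https://yourorg.crm.dynamics.com";   // assumption: your environment URL
const accountId = "00000000-0000-0000-0000-000000000000";   // hypothetical account record id
const rollupColumn = "new_opencasecount";   // hypothetical rollup column on the account table

// Call the unbound CalculateRollupField function to force a recalculation of the rollup column.
const target = encodeURIComponent(`{'@odata.id':'accounts(${accountId})'}`);
const url = `${orgUrl}/api/data/v9.2/CalculateRollupField(Target=@p1,FieldName=@p2)?@p1=${target}&@p2='${rollupColumn}'`;

fetch(url, { headers: { "OData-MaxVersion": "4.0", "OData-Version": "4.0", "Accept": "application/json" } })
  .then(response => response.json())
  .then(record => console.log("Recalculated rollup value:", record[rollupColumn]));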

Relationships

Often we will work with one to many (1:N) relationships. So one order can have many order lines etc.

There are situations when you may need a many to many relationship (N:N). For example one patient could be seen by many doctors. And at the same time each doctor would have many patients.

Deciding when an N:N relationship is needed can sometimes be harder than you’d think! Creating a POC of your data model can help identify many to many relationship requirements.

Tip: If you aren’t fully familiar with the types of relationships in the Dataverse and the possible cascade logic then some additional revision into relationships may be beneficial.

You may wish to review this post in which I explain table relationships.

Alternate Keys

You can also define alternate keys. Alternate keys can contain decimal, whole number, text fields, dates and lookup fields. Entities can have up to 5 alternate keys. An index is created behind the scenes which is used to enforce uniqueness. An alternate key can be made up of multiple fields but its total length cannot exceed 900 bytes or 16 columns per key.

Alternate keys are often useful for fields like account number (etc) that may be primary references supplied to / from external systems. And would therefore need to have uniqueness strictly enforced.
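
Alternate keys are normally defined through the maker portal, but for completeness here is a rough sketch of how one might be created via the Dataverse Web API by posting an EntityKeyMetadata definition. Treat this purely as an illustration; the schema name and key column are hypothetical examples and, as before, authentication would be needed outside of a logged-in browser session.

const orgUrl = "https://yourorg.crm.dynamics.com";   // assumption: your environment URL

// Define an alternate key on the account table based on the account number column.
// Once created, an index is built behind the scenes and uniqueness is enforced on this column.
fetch(`${orgUrl}/api/data/v9.2/EntityDefinitions(LogicalName='account')/Keys`, {
  method: "POST",
  headers: {
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    SchemaName: "new_accountnumberkey",   // hypothetical schema name
    DisplayName: {
      "@odata.type": "Microsoft.Dynamics.CRM.Label",
      LocalizedLabels: [{ "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel", Label: "Account Number Key", LanguageCode: 1033 }]
    },
    KeyAttributes: ["accountnumber"]   // uniqueness will be enforced on this column
  })
}).then(response => console.log("Key creation returned status:", response.status));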

Design business unit team structure

Business Units are a fundamental building block in the Dynamics security model. They define the structure of the organization.
Business units provide a great way to manage record access for large numbers of users with permissions being granted to the current business unit and / or its children.

Therefore the first security construct to consider are business units, these allow you to define the structure of your organization. Users, teams and security roles will be linked to business units making them a fundamental building block in the security model. Business units can have a hierarchy. This might reflect the organizations actual hierarchy but that isn’t always the case. In actual fact the hierarchy is designed to “drive” which users get access to what records.

Each business unit has a default team that includes all users assigned to that business unit. Each user can only be in one business unit but they can be added to additional teams as required.

Teams have several uses in Dynamics 365 / Power Platform. Teams are essentially lists or groups of users and can be used to help manage security. Teams may also be useful when creating reports. Additionally, teams can be useful when you wish to share “assets” with a group of users. (Assets could include views, charts, dashboards or records.)

Owner teams within the Dataverse can simply contain lists of users. But a Dataverse team can also be connected to an Azure Active Directory Office Group or Security Group.

We also have the concept of access teams. Access teams are dynamically created on a per-record basis, and the access granted to the team members for that specific record is based on an access team template.

If you need to revise the concepts of business units and teams in greater depth, try reading my post on the subject here.

Tip:
When designing the data model consider its manageability. Creating large numbers of business units and security roles may give granular control of security options but could also result in a solution which is a nightmare to manage. You should use business units to restrict access as required but do not be tempted to simply mirror the organisation’s structure.

Security Hierarchy

An additional approach to security is to use a security hierarchy. With this approach a manager is granted access to the records of their team members. As a multi-level team structure could exist, the permissions can be inherited through multiple levels. This way a manager can read records for their direct reports and further down the hierarchy. The manager, however, can only maintain records owned by their direct reports.

The security hierarchy can be based on a concept of manager or position. With manager the manager field from the systemuser record is used. With a position approach custom positions are defined and users assigned to each position. (The position approach allows crossing of business unit boundaries.)

When defining the hierarchy settings we can decide how many levels to target. The number of levels implemented should typically be kept as low as possible (maybe 4 levels), although the theoretical maximum is 100.

Note: You cannot combine manager and position hierarchies at the same time.

Design security roles

Security roles within the Dataverse provide users with access to data. As a general principle users should only be granted access to data that they really need to access.

The Solution Architect will need to decide what approach should be taken to security roles. Dynamics 365 does ship with a number of out of the box security roles; commonly these will be copied and then amended as required. There are some common strategies for building security roles;

  • Position specific – each person who holds a given position is given one specific role. (All users have one role.)
  • Baseline + position – a baseline role is created which grants access to the entities needed by all users. Then an additional role is added specific to the additional requirements for a given position. (All users have at least two roles.)
  • Baseline + capability – again a baseline role is created. Then roles are added for each capability / feature that might be required. (All users would have multiple roles.)

Tip:
As part of your revision you may wish to consider the advantages and disadvantages of each of the strategies above. Additionally maybe review how security is achieved in your current organization!

Each role is linked to a business unit. The roles then contain many options that govern the access to entities and features for the combination of business unit and user (or team).

Often roles will be inherited from a parent business unit. Meaning when I manage roles for a user linked to a child business unit, I still see the roles from the parent business unit. This happens even though I haven’t created any roles specifically for the child unit!

Each user (or team) can have multiple roles. The Dynamics 365 security model works on a least restrictive basis, meaning a “sum” of all the roles assigned to the user is applied. Additionally, when a role is assigned to a team, we can select whether the privileges apply to “just” the team or get applied directly to the users, meaning members of the team inherit the privileges as if the role were applied directly to their user record.

Each role is made up of several options (actions) for each table / feature in Dynamics 365. Entities typically have multiple actions covering functions including, create, read, write, delete, append, append to, assign and share.

Each table can be organization owned or user / team owned. For organization owned tables the security privileges have no access levels, meaning each user either is or isn’t granted access. Whilst user / team owned tables support access being granted at the user, business unit, child business units or organization level.

If you need to revise the concepts associated with security roles in greater depth, try reading my post on the subject here.

Design column (aka field) security

Often it will be sufficient to only control access at the table level. But it may be that a particular column on a table contains sensitive data that only certain users can see and maintain. We can hide columns on forms but this does not actually secure the data.

In these scenarios Dynamics 365 and the Power Platform supports column level security.

For example you may store information on a contact about their disabilities / special needs. A limited number of users may need access to this information, but most users probably don’t need to see what disabilities a person might have.

You can read about how this operates here.

MB 600: Microsoft Dynamics 365 + Power Platform Solution Architect – Design integrations


I am currently preparing to take the Microsoft Dynamics 365 + Power Platform Solution Architect certification. Or “MB 600” for short! As I prepare I plan to write revision notes. In this post I will cover my revision connected with designing integrations.

Note: Microsoft have recently updated some of their terminology. Therefore we should consider terms like CDS and Dataverse as interchangeable. In terms of the exam I would expect them to begin using the term Dataverse at some point but that change is unlikely to be immediate. Meaning in the short term either term could be used. I created this blog post before the revised terms were announced. I have tried to apply updates but a few “outdated” references may still exist!

The Solution Architect will be required to identify when integrations may be required. They will lead the design of any integrations and document how they will be included in the overall architecture. It is the Architect’s role to ensure any integrations do not make the solution “fragile”, and they may additionally need to consider integrations as part of an overall disaster recovery plan.

There are many reasons we need integrations. Integrations may be required to provide a consistent interface for users / customers. Maybe real-time integrations are needed to keep disparate systems up-to-date. Often integrations are required to avoid reinventing the wheel! Reusing an existing system or component with the help of integrations may be more cost effective and achievable than reimplementing it.

A Power Platform Solution Architect must have a deep technical knowledge of Power Platform and Dynamics 365 apps plus at least a basic knowledge of related Microsoft cloud solutions and other third-party technologies. When designing integrations the Solution Architect will review the requirements and identify which parts can leverage Dynamics 365 apps and which parts must be custom built using the Power Platform, Microsoft Azure etc.

Historically an architect might have started with custom development in mind but a Power Platform Solution Architect will typically start their design process with a focus on Dynamics 365 and the Power Platform and only later use Microsoft Azure and other custom approaches to address any gaps.

When considering the design of integrations it may be worth thinking about how the Dataverse operates. The Dataverse is a software-as-a-service (SaaS) platform, therefore most of its architecture, such as the underlying data storage, is abstracted from developers so they can focus on other tasks such as building custom business logic and integrating with other applications.

Note:
Historically we may have used the SOAP API to programmatically retrieve data from the Dataverse (CDS). But these days the Web API is the preferred approach. If you aren’t already familiar with the Web API approach you can read about it here.
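
As a simple illustration, the sketch below queries the Dataverse Web API using its OData endpoint. (The environment URL, token acquisition and column names are assumptions for illustration only.)

```typescript
// Sketch: a simple OData query against the Dataverse Web API using fetch.
// The environment URL, token and column names are assumptions for illustration.
const orgUrl = "https://yourorg.crm.dynamics.com";

async function getTopAccounts(token: string): Promise<unknown[]> {
  const query = "/api/data/v9.2/accounts?$select=name,accountnumber&$top=5";
  const response = await fetch(orgUrl + query, {
    headers: {
      "Authorization": `Bearer ${token}`,
      "Accept": "application/json",
      "OData-MaxVersion": "4.0",
      "OData-Version": "4.0",
    },
  });
  if (!response.ok) {
    throw new Error(`Web API request failed: ${response.status}`);
  }
  const payload = await response.json();
  return payload.value; // OData collections are returned in the "value" array
}
```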

Most solutions do not exist in isolation; they rely on internal and external integrations. As part of identifying solution components the Architect should highlight how any integrations will be handled. We may need to define what tools or services will be used to complete the integrations, and also potentially define clear boundaries on where one system ends and another begins. The Solution Architect should focus on what happens at these boundaries. Often multiple parties will be involved in the development of integrations, so clearly defining which boundaries are “owned” by which supplier or internal development team may be critical.

Integrations can take many forms.

Data integration – “just” combining data from different sources, maybe to provide the user with a unified view. Data integrations may be event based (near real-time) or batch based (maybe with overnight updates).

Application integration – a higher level integration connecting at the application layer.

Process integration – potentially you retain multiple disparate systems but each of those systems remains part of the overall business function.

Design collaboration integration

Often integration may involve collaboration tools such as Outlook, Teams, Yammer and more.

SharePoint, for example, may be leveraged for document storage, whilst Teams can be created to help groups of users collaborate.

Often the associated collaboration tools will sit outside of the Power Platform and Dynamics 365, meaning each tool will have its own requirements / constraints in terms of license costs, security considerations and more.

Design Dynamics 365 integration

There are a number of different types of integration / extensibility options available to us when considering the integration of Dynamics 365 applications.

Some of these capabilities are baked into the Power Platform, including business rules, calculated fields, workflow processes and much more. Client-side extensibility can also be created with JavaScript.

Additionally developers can access the transactional pipeline with plug-ins (using the .NET SDK) or custom workflow activities. As mentioned we can access the Dataverse (CDS) API service using SOAP or the Web API (OData).

Custom business logic can be created via plug-ins. This custom logic can be executed synchronously or asynchronously. The results of synchronous calls can be seen by users immediately, but this does mean the user is “blocked” until the call completes, whilst asynchronous calls will not block the user. However asynchronous calls can only run post-operation. (Synchronous calls can run pre- or post-operation.)

Synchronous calls are included in the original transaction but are limited to 2 minutes. Whilst asynchronous customizations would not be part of the original transaction.

FYI: When you use the Dataverse connector in Power Automate or a Canvas App it will be making calls to the OData API. (Power Automate runs asynchronously.) Classic workflows can run synchronously (real-time) or asynchronously.

Any custom logic may run on the client or the server. Examples of client side extensibility include canvas app formulas, JavaScript on model-driven forms, business rules (with form scope) and the Power Apps component framework. Client side customizations happen in the user interface, meaning the user will see the results immediately. But because they run in the user interface, these customizations will normally only be enforced in the specific client applications.

Whilst server side customizations only happen when the data is sent to the server, meaning users will only see the results once a data refresh has completed. Server side customizations can be created with plug-ins, Power Automate flows, classic workflows and business rules (with entity scope).
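
As an illustration of the client side option, below is a minimal sketch of a model-driven form script. (It assumes the @types/xrm type definitions and uses hypothetical column names.)

```typescript
// A minimal sketch of an onLoad handler for a model-driven form (assumes @types/xrm).
// Register it on the form's OnLoad event and pass the execution context as the first parameter.
// The column names (new_category, new_discount) are hypothetical.
function onLoad(executionContext: Xrm.Events.EventContext): void {
  const formContext = executionContext.getFormContext();

  // Client-side logic runs immediately in the UI: hide the discount column unless the
  // category column holds the value "premium".
  const category = formContext.getAttribute("new_category")?.getValue();
  const discountControl = formContext.getControl<Xrm.Controls.StandardControl>("new_discount");
  discountControl?.setVisible(category === "premium");
}
```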

To ensure consistent availability and performance for everyone, the platform applies limits to how the Common Data Service APIs are used. Service protection API limits help ensure that users running applications cannot interfere with each other based on resource constraints. These limits should not affect normal users of the platform. You can read an overview of API limits here.

API requests within the Power Platform consist of various actions which a user makes, including;

  • Connectors – API requests from connectors in Power Apps and Power Automate
  • Power Automate – all Power Automate step actions result in API requests
  • Dataverse (Common Data Service) – all create, read, update and delete (CRUD) operations, plus “special” operations like share and assign.

You can read more about APIs and their limits here. Because limits exist the Solution Architect may need to optimize integrations to minimize API calls. Additionally, data integration applications may need to handle API limit errors; this is done by implementing a strategy to retry operations if an API limit error is received. You can read about the service protection and API limits here.
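
As an illustration, a data integration might wrap its Web API calls in a simple retry helper along these lines. (This is just a sketch; it assumes the service returns HTTP 429 with a Retry-After header when a limit is hit.)

```typescript
// Sketch: retrying a Dataverse Web API call when a service protection limit (HTTP 429) is hit.
// The Retry-After header tells the client how long to wait before trying again.
async function fetchWithRetry(url: string, init: RequestInit, maxRetries = 3): Promise<Response> {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const response = await fetch(url, init);
    if (response.status !== 429) {
      return response; // success or a non-throttling error – return to the caller
    }
    // Honour the suggested wait time, falling back to a simple exponential backoff.
    const retryAfterSeconds = Number(response.headers.get("Retry-After")) || 2 ** attempt;
    await new Promise((resolve) => setTimeout(resolve, retryAfterSeconds * 1000));
  }
  throw new Error("Request kept hitting service protection limits");
}
```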

Design internal system integration

Internal app builders / customizers may be able to leverage low code approaches to support internal integrations. Maybe these integrations can be created using canvas apps or Power Automate flows with standard or custom connectors. Additionally canvas apps may be embedded into model-driven apps to leverage connectors.

Power Automate may be used for event based data integrations, especially if integrations are needed between multiple Dataverse environments.

Design third-party integration

Developers may be required to assist with 3rd party integrations. Maybe this will involve creating custom APIs, developing plug-ins or using other technologies such as Azure Logic Apps or Azure Service Bus.

Virtual entities may also give visibility of external data sources directly in model-driven apps. Or maybe custom UI controls can be created to surface 3rd party data using the Power Apps Component Framework (PCF).

For inbound data integrations, 3rd party tools such as KingswaySoft or Scribe may be used. When using such tools performance / throughput may need to be considered; you may need to design multiple threads to overcome latency effects.

Web application integrations may be created using webhooks, where custom callbacks are triggered by sending JSON in an HTTP POST request. Webhooks may be synchronous or asynchronous.
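
As an illustration, the sketch below shows a minimal webhook receiver built with Express. (The payload property names shown, such as MessageName and PrimaryEntityName, come from the serialised execution context, but treat the whole thing as illustrative only.)

```typescript
// Sketch: a minimal Express endpoint that receives a webhook POST from Dataverse.
// The payload is the serialised execution context; property names are illustrative.
import express from "express";

const app = express();
app.use(express.json());

app.post("/dataverse-webhook", (req, res) => {
  const context = req.body;
  console.log(`Received ${context.MessageName} on ${context.PrimaryEntityName}`);

  // Respond quickly – long-running work should be queued and processed elsewhere.
  res.sendStatus(200);
});

app.listen(3000, () => console.log("Webhook listener running on port 3000"));
```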

Design authentication strategy

The systems into which integrations are required may demand differing approaches to authentication.

When working in a batch mode for updating data the timing of authentication requests may also be a consideration which could impact performance. Maybe you shouldn’t authenticate on every request!

OAuth

OAuth is a standard that apps can use to provide client applications with “secure delegated access”. OAuth works over HTTPS and authorizes devices, APIs, servers, and applications with access tokens rather than credentials. Authentication integration may be provided with OAuth providers such as Azure Active Directory, Facebook, Google, Twitter and Microsoft accounts. With basic authentication the user would always have to provide a username and password, whereas with OAuth the client application presents an access token (typically obtained using a client ID and secret).
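
As an illustration, the sketch below acquires a token from Azure Active Directory using the client credentials flow; the token could then be presented as a bearer token on Dataverse Web API calls. (Tenant ID, client ID, secret and environment URL are all placeholders.)

```typescript
// Sketch: acquiring an OAuth access token from Azure Active Directory using the
// client credentials flow. All identifiers below are placeholders.
async function getAccessToken(): Promise<string> {
  const tenantId = "<tenant-id>";
  const body = new URLSearchParams({
    grant_type: "client_credentials",
    client_id: "<app-registration-client-id>",
    client_secret: "<client-secret>",
    scope: "https://yourorg.crm.dynamics.com/.default",
  });

  const response = await fetch(
    `https://login.microsoftonline.com/${tenantId}/oauth2/v2.0/token`,
    { method: "POST", body }
  );
  const json = await response.json();
  return json.access_token; // present this as a Bearer header on subsequent API calls
}
```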

Design business continuity strategy

In terms of availability and disaster recovery, the Power Platform itself should handle concerns for internal integrations; therefore the Solution Architect may need to focus on external integrations. Additionally, manual environment backup / restore operations can be used to help with “self-created” problems like bad deployments or mass data corruption.

Separation of applications can reduce their reliance on each other. Using Azure Service Bus may allow integrations between systems in a decoupled manner.

Design integrations with Microsoft Azure

Webhooks can only scale when the host application can handle the volume. Azure Service Bus and Azure Event Hubs may be used for high scale processing / queuing of requests, although the Azure approach can only be asynchronous. (Webhooks can be synchronous or asynchronous.)

Azure Service Bus

CDS supports integration with Azure Service Bus. Developers can register plug-ins with Common Data Service that can pass runtime message data, known as the execution context, to one or more Azure solutions in the cloud. Azure Service Bus integrations provide a secure and reliable communication channel between the Common Data Service runtime data and external cloud-based line-of-business applications. You can read more about Azure integration here.

Azure Service Bus distributes messages to multiple independent backend systems, decoupling the applications.

Azure Service Bus can protect the application from temporary peaks.
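
As an illustration, the sketch below sends a message to a Service Bus queue using the @azure/service-bus package. (The connection string, queue name and message shape are placeholders.)

```typescript
// Sketch: decoupled messaging with Azure Service Bus using the @azure/service-bus package.
// Connection string, queue name and message body are placeholders.
import { ServiceBusClient } from "@azure/service-bus";

async function sendIntegrationMessage(): Promise<void> {
  const client = new ServiceBusClient("<service-bus-connection-string>");
  const sender = client.createSender("crm-outbound");

  // The publishing system only needs to hand the message to the queue;
  // consumers can process it later at their own pace.
  await sender.sendMessages({ body: { entity: "account", id: "<record-id>", action: "update" } });

  await sender.close();
  await client.close();
}
```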

Azure Event Hubs

Azure Event Hubs is a big data streaming platform and event ingestion service. It can receive and process millions of events per second. Data sent to an event hub can be transformed and stored by using any real-time analytics provider or batching/storage adapters.

The following are some of the scenarios where you can use Event Hubs:

  • Anomaly detection (fraud/outliers)
  • Application logging
  • Analytics pipelines, such as clickstreams
  • Live dashboarding
  • Archiving data
  • Transaction processing
  • User telemetry processing
  • Device telemetry streaming

Event Hubs provides a distributed stream processing platform with low latency and seamless integration, with data and analytics services inside and outside Azure to build your complete big data pipeline.

Event Hubs represents the “front door” for an event pipeline, often called an event ingestor in solution architectures. An event ingestor is a component or service that sits between event publishers and event consumers to decouple the production of an event stream from the consumption of those events. Event Hubs provides a unified streaming platform with time retention buffer, decoupling event producers from event consumers.
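
As an illustration, the sketch below publishes a batch of events to an event hub using the @azure/event-hubs package. (Connection string, hub name and event shape are placeholders.)

```typescript
// Sketch: publishing telemetry events to Azure Event Hubs with the @azure/event-hubs package.
// Connection string and hub name are placeholders.
import { EventHubProducerClient } from "@azure/event-hubs";

async function publishEvents(events: object[]): Promise<void> {
  const producer = new EventHubProducerClient("<event-hubs-connection-string>", "telemetry");

  const batch = await producer.createBatch();
  for (const event of events) {
    // tryAdd returns false when the batch is full; a real ingestor would send and start a new batch.
    batch.tryAdd({ body: event });
  }

  await producer.sendBatch(batch);
  await producer.close();
}
```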

Azure Logic Apps

Azure Logic Apps is a cloud service that helps you schedule, automate, and orchestrate tasks, business processes, and workflows when you need to integrate apps, data, systems, and services across enterprises or organizations. Logic Apps simplifies how you design and build scalable solutions for app integration, data integration, system integration, enterprise application integration (EAI), and business-to-business (B2B) communication, whether in the cloud, on premises, or both.

Every logic app workflow starts with a trigger, which fires when a specific event happens, or when new available data meets specific criteria. Many triggers provided by the connectors in Logic Apps include basic scheduling capabilities so that you can set up how regularly your workloads run.

In many ways Azure Logic Apps and Power Automate flows have a lot in common. However, Power Automate flows can be packaged as part of a CDS solution, the Power Automate CDS connector has more capabilities, and Power Automate allows UI automation.

Azure Functions

Azure Functions allows you to run small pieces of code (called “functions”) without worrying about application infrastructure. With Azure Functions, the cloud infrastructure provides all the up-to-date servers you need to keep your application running at scale.

A function is “triggered” by a specific type of event. Supported triggers include responding to changes in data, responding to messages, running on a schedule, or as the result of an HTTP request.

While you can always code directly against a myriad of services, integrating with other services is streamlined by using bindings. Bindings give you declarative access to a wide variety of Azure and third-party services.

Azure functions are serverless applications.
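
As an illustration, the sketch below shows a minimal HTTP-triggered function using the Node.js programming model. (The function name, route and payload handling are illustrative assumptions.)

```typescript
// Sketch: an HTTP-triggered Azure Function using the Node.js v4 programming model.
// The function name and response shape are illustrative only.
import { app, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";

app.http("echoIntegrationPayload", {
  methods: ["POST"],
  authLevel: "function",
  handler: async (request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> => {
    const payload = await request.json();
    context.log("Received integration payload", payload);

    // A real function might validate the payload and forward it to the Dataverse Web API.
    return { status: 200, jsonBody: { received: true } };
  },
});
```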

Choosing between Power Automate, Azure Logic Apps, Azure Functions and Azure App Service WebJobs may be confusing, as all of these solve integration problems and automate business processes. This link may help you compare these options!

MB 600: Microsoft Dynamics 365 + Power Platform Solution Architect – Validate the solution design


I am currently preparing to take the Microsoft Dynamics 365 + Power Platform Solution Architect certification. Or “MB 600” for short! As I prepare I plan to write revision notes. In this post I will cover everything that might fall under the heading “validate the solution design”.

Note: Microsoft have recently updated some of their terminology. Therefore we should consider terms like CDS and Dataverse as interchangeable. For the exam I would expect them to begin using the term Dataverse at some point but that change is unlikely to be immediate. Meaning in the short term either term could be used. I created this blog post before the revised terms were announced. I have tried to apply updates but a few “outdated” references may still exist!

The Solution Architect will work with the Quality Assurance (QA) team to ensure testing includes all parts of the architecture. This could include functional testing but may also include other types of testing such as disaster recovery and performance testing.

Proper testing is essential to ensure project success. The Architect is often one of the key people who knows the solution best and can therefore guide the test team on how to validate it. Testing shouldn’t be considered something that just happens towards the end of a project; testing must be an ongoing effort from the first component built until go live. It is not a one-time big exercise, it is an iterative, repetitive task that happens throughout the project life cycle.

Evaluate detail designs and implementation

By the implementation (build) phase the Solution Architect has set the path the implementation team will follow. By this point the Architect will have created a high level solution design document (SDD) and will have reviewed any associated detailed designs. Meaning that during implementation the role of the Architect will shift to a supporting one for the Project Manager and Delivery Architect, as they will be responsible for ensuring the developers create the solution as defined. This includes facilitating reviews with the team to ensure the implementation is meeting the architecture, as well as reviews with the customer to ensure the solution is meeting their stated requirements.

During the build problems will happen! Therefore the Solution Architect is also involved in problem solving, as they are often one of the few people who understand all the “moving parts” involved in the solution. Some of those “problems” may be found by the testing team and the multiple types of testing they may complete.

There are many types of testing, I will highlight some of the common types below;

Unit Tests – Typically the unit tests will be created and run by the application builder or developer. Unit tests confirm the correct operation of each component in isolation.

It will be a common practice for everyone to check their own work before handing off. Manual testing may be completed for all apps, business rules, plug-ins etc. Some tests can be automated using Power Apps Test Studio and Visual Studio.

Functional Tests – Functional testing verifies that the implementation meets the requirements.

Acceptance Tests – Acceptance testing is completed by the end users, although I have often seen test teams support the business in this task. The purpose / outcome of an acceptance test is the formal approval that the solution is fit for purpose.

Regression Tests – Tests completed on unchanged functions. Regression tests typically aim to prove that nothing has been “broken by accident”.

Integration Tests – Verification that everything works together, including the various internal systems and any integrated 3rd party services.

The Solution Architect may be called upon to help the test team understand how to test integrated components.

External systems will need to be available for testing. Sometimes they may require careful preparation to ensure real data is not impacted. For example, email integration testing must ensure messages aren’t inadvertently sent to real customers!

Performance Tests – Confirmation that the system can cope with expected peak loads. (And maybe “slightly” beyond the stated peak loads. Can the system cope with future growth?)

You may need to define performance “hotspots” to test. These may be areas of the system known to be complex or those where the business has expressed specific performance expectations. Additionally, requirements gathering might need to establish what peak volumes look like. Occasionally you may even have contractual obligations to test; it may be that the contract specifies the system must support “n” users or cope with “y” transactions etc.

You may need to include the infrastructure as part of performance testing. For example, network traffic at remote offices may need to be measured, including network latency and bandwidth.

Migration Tests – Practice data migrations to ensure data quality.

Disaster Recovery Tests – Having a disaster recovery plan is great but it is useless unless you can test it works.

Go Live Tests – It is common to complete multiple “dry runs” of the go live process. This might involve data migration tests!

Note:
Performance testing may make use of Azure Application Insights.

Azure Application Insights, a feature of Azure Monitor, is an extensible Application Performance Management (APM) service for developers and DevOps professionals. Use it to monitor your live applications. It will automatically detect performance anomalies, and includes powerful analytics tools to help you diagnose issues and to understand what users actually do with your app. It’s designed to help you continuously improve performance and usability. You can find out more about Azure App Insights here.
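
As an illustration, the sketch below tracks a page view and a custom event from a web client using the Application Insights JavaScript SDK. (The connection string, event name and measurement are placeholders.)

```typescript
// Sketch: tracking custom telemetry from a web client with the Application Insights JavaScript SDK.
// The connection string and event details are placeholders.
import { ApplicationInsights } from "@microsoft/applicationinsights-web";

const appInsights = new ApplicationInsights({
  config: { connectionString: "<application-insights-connection-string>" },
});
appInsights.loadAppInsights();

// Record a page view plus a custom event with a timing measurement for a performance "hotspot".
appInsights.trackPageView();
appInsights.trackEvent({ name: "CaseFormSaved" }, { durationMs: 420 });
```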

Validate security

Having collected your security requirements, designed a solution and built it, you will need to test that security is working in the required manner.

In terms of the Power Platform application this can require a significant testing effort, as test scripts will be required that must be repeatedly run against each user persona (or set of security roles if you like). This effort should not be underestimated.

We should also consider security beyond the Power Platform. For example, we might be using SharePoint for document management integrated in with our solution. Meaning access to SharePoint may need to be tested independently of our Power Platform solution.
