Category: Microsoft

Microsoft Teams Rooms for modern meetings

How easy is it at your company to start a Teams or Skype meeting in your conference room without technical difficulties? Maybe you have a very large (and expensive) video conference system in your board room, but wish you could also equip the smaller huddle rooms with such systems? Then you should look into Microsoft Teams Rooms, which is the new name for Skype Room Systems.

You can't argue with the trend toward a more modern and mobile workplace. In a few years, more and more employees will probably not be stationed at a particular office or desk. This requires better tools and services, and digital meetings are a big part of that. Over the last 3 years we have seen massive growth, with more video conference rooms installed than in the previous 30 years, and a shift from proprietary (and expensive) solutions to standardized, more affordable systems, so even the smallest huddle room can get one…

In its simplest form, you book the room in Outlook as you have done for years and choose whether it should be a Teams or a Skype meeting:

When you enter the conference room, the control unit on the table lights up and shows you the upcoming meetings:

All you have to do is click Join on your meeting, and within a few seconds the meeting is started and all participants are joined, whether they connect via the Teams/Skype client, the web client, the app on their phone, or by dialing the number in the invitation. You see the participants on the control unit and on the big screen at the front of the room, and of course their video if they share it. From the control unit you can mute/unmute participants and instantly add people to the meeting from the directory or call them.

Want to share your screen? Simple: just plug the HDMI cable into your laptop and your screen will be output to the big screen and also shared in the meeting with remote participants. Of course, remote participants can share their screen in the meeting as well.

It’s the simplicity: one click to join and the meeting is started. You no longer need to be a technician to get a meeting going, choosing the correct input on the big screen or picking the right speaker and mic.

Microsoft Teams Rooms systems come from different partners (Logitech, HP, Lenovo, Crestron, Polycom, Yealink) that offer certified systems in different sizes, from the smallest 4-person huddle room to the largest boardroom. A few examples:

Xenit has used Skype Room Systems for a long time and is extremely happy with how they work.

So what about the tech and for IT?

Compared to other proprietary systems, Microsoft Teams Rooms runs on Windows 10 with a Windows app. This means you can use your current tools for deploying and managing it, just as you would for any other Windows client, except that you need to make sure not all policies apply to the system. On-premises AD join, Azure AD join and Workgroup are all supported. The app itself, which only installs on certified devices (so you can’t do this DIY), is automatically updated through the Windows Store. For us at Xenit, there has been almost no support needed for these systems since they were first set up, except for some occasional hardware issues where someone was “smart” enough to disconnect the HDMI cabling and connect it directly to their laptop.

Of course, Microsoft has done some work to cloud enable these devices if you want.

For example, you can use Azure OMS (Operations Management Suite) to monitor these devices, since they log a lot of information to the event log. Among other things, you can get information regarding:

  • Active / Inactive Devices
  • Devices which experienced hardware / applications issues (disconnected cables anyone?)
  • Application versions installed
  • Devices where the application had to be restarted

All of this can be alerted upon, so you can hopefully solve problems before someone calls them in.
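As a sketch, such an alert can be based on a Log Analytics query over the events the devices write. The event log name below is an assumption tied to the Skype Room Systems monitoring setup; adjust it to whatever log your collection is configured for:

```
// Sketch: surface devices reporting errors (event log name is an assumption)
Event
| where EventLog == "Skype Room System"
| where EventLevelName == "Error"
| summarize Errors = count() by Computer, EventID
```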

In a few months, Microsoft Teams Rooms will light up in the Teams Admin Center for additional functionality. For example, if you enroll many of these devices, the admin center will let you enroll them more quickly using a profile with the settings you want. It will also simplify inventory management, updates, monitoring and reporting.

Here’s a short demo:

Let us know if you want to discuss or even get a personal demo at our office.



Easily analyse your memory dumps

Recently I stumbled upon a great application while trying to examine a memory dump. The application is named WinDbg Preview, is distributed by Microsoft themselves, and serves several purposes for debugging Windows operating systems.

WinDbg Preview is a modernized version of WinDbg and extremely easy to use! With WinDbg Preview you can for example do the following:

  • Debug executables
  • Debug dump and trace files
  • Debug app packages
  • Debug scripts

WinDbg Preview

In my use case I wanted to quickly analyse a memory dump file that had been generated. A minute and about five clicks later I had an analysis that gave me all the information I needed. The tool also suggested which commands to use along the way, without me having to think about it.
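For a crash dump, the typical flow after opening the dump file comes down to a few standard WinDbg commands (these are not specific to the Preview version):

```
.symfix      ; point the symbol path at the Microsoft public symbol server
.reload      ; reload symbols for the loaded modules
!analyze -v  ; run the automated, verbose crash analysis
```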

Attaching memory dump file

Analysis result

WinDbg Preview is available from the Windows Store, and you can read more about it here.

If you have any questions, feel free to email me at robert.skyllberg@xenit.se or comment down below.



Changing default ADFS Decrypt/Signing Certificate lifetime from 1 year to X years

ADFS 2.0 and above have a feature called AutoCertificateRollover that automatically renews the Decrypt and Signing certificates in ADFS; by default these certificates have a lifetime of 1 year. If you have federations (Relying Party Trusts) configured and a Service Provider (SP) is not using the ADFS metadata file to keep its configuration updated when ADFS changes occur, the ADFS administrator will have to notify that Service Provider of the new Decrypt/Signing certificate thumbprints each time the ADFS servers automatically renew the certificates.

To minimize the frequency of this task, you can change the default lifetime of the Decrypt and Signing certificates so you only have to do it every X years instead of every year.

Below is the ADFS 3.0 Powershell configuration you can run to change the default lifetime to 5 years.
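A minimal sketch of that configuration, assuming AutoCertificateRollover is enabled (the default) and the commands are run on the primary ADFS server:

```powershell
# Set the lifetime of auto-generated certificates to 5 years (1827 days).
Set-AdfsProperties -CertificateDuration 1827

# Generate new secondary Token-Signing and Token-Decrypting certificates
# with the new lifetime; they are promoted to primary at the normal
# rollover time unless you force it with -Urgent.
Update-AdfsCertificate -CertificateType Token-Signing
Update-AdfsCertificate -CertificateType Token-Decrypting
```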


See below for how it should look with new Secondary certificates created with a lifetime of 5 years. When the date 3/23/2019 is reached, the ADFS server will automatically activate the (currently) Secondary certificates and update its metadata file accordingly. For any federations that do not use the ADFS metadata file, the SPs will have to update the decrypt/signing certificate thumbprints on their side on that particular date (and at a specific hour, to minimize any downtime of the federation trust).

If you have any questions or comments on above, feel free to leave a message here or email me directly at rasmus.kindberg@xenit.se.




Simplify removing of distributed content with the help of Powershell

Begin

TLDR; Go to the Process block.

Ever since I was first introduced to Powershell, I have tried to come up with ways to include, facilitate and apply it to my everyday tasks. But for me, using Powershell in combination with SCCM has never been the ultimate combination: the built-in cmdlets don’t always do what I need, and the GUI is most of the time easier to understand.

So when I got a request to simplify the removal of distributed content from all distribution points or all distribution point groups, it left me with two options: to create a script that did the desired job, or to create a function that would cover all possible scenarios. So I thought, “Why don’t I take matters into my own hands and create what I actually want?” That is why I created a script that helps you find the content you want to remove and removes the distributed content from every Distribution Point or Distribution Point Group.

Let’s say that you have 10 Distribution Points, you have distributed content to 5 out of 10, and you have not been using a Distribution Point Group. The way to go would be to repeatedly proceed with the following steps:


Doing these steps for every distribution point would just take forever. Using a single Distribution Point Group would of course be more effective and the ideal way to go, but what if you have distributed the content to multiple Distribution Point Groups? That has already been thought of, and that is why this script was created. Even if you have distributed content to some distribution points and some distribution point groups, it will all be removed.

Process

But how does it work? In this demonstration, I have two packages distributed with similar names. One of them is sent to a Distribution Point Group, and the other one to 2 Distribution Points. I would like to have both of them removed from wherever they have been distributed.
1. Start by launching Powershell, and import the script by running “. .\Remove-CMAllSiteContent.ps1”

2. Run the script with the required parameters. As shown in the picture below, I searched for ‘TestCM’, and it returned multiple results. The search is done with wildcards, so everything similar to the stated PackageName will be found. All the parameters have a more detailed description in the script below.

  • The search can be done with either the -PackageName or the -PackageID parameter.
  • The -PackageName parameter searches with wildcards at both the beginning and the end of the stated name. Use it when you are not sure of the PackageID, or when you want to remove multiple packages.
  • The -PackageID parameter is the unique ID of the specific package you want to remove from the distribution point(s) or group(s). Use it when you are sure of what you want to remove.
  • The -CMSiteCode parameter is mandatory and must be specified.

3. In this case, I would like to remove both of the displayed packages, so I choose 0 for ‘All’, followed by a confirmation (Y/N is not case sensitive).

4. After it has been confirmed, the script will check the following:

  • Whether the content is distributed to Distribution Point Group(s) as an Application,
  • If not, whether it is distributed to Distribution Point Group(s) as a Package,
  • If neither matches, whether the content is distributed to each Distribution Point as an Application,
  • If not, whether the content is distributed to each Distribution Point as a Package.

At the beginning of the script, the content is validated as distributed; if it is not, it will not be shown. The four steps above cover all distribution scenarios.
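The checks above can be sketched with the ConfigurationManager cmdlets. This is a simplified illustration, not the actual script; it assumes the ConfigurationManager module is imported, your location is set to the site drive (e.g. `Set-Location "P01:"`), and $PackageName is a placeholder:

```powershell
# Simplified sketch of the removal logic (not the full script).
$PackageName = 'TestCM'   # placeholder search term

# Applications and classic packages matching the wildcard search
$Apps     = Get-CMApplication -Name "*$PackageName*"
$Packages = Get-CMPackage -Name "*$PackageName*"

# First try every Distribution Point Group, as Application then as Package
foreach ($Group in Get-CMDistributionPointGroup) {
    foreach ($App in $Apps) {
        Remove-CMContentDistribution -ApplicationName $App.LocalizedDisplayName `
            -DistributionPointGroupName $Group.Name -Force -ErrorAction SilentlyContinue
    }
    foreach ($Pkg in $Packages) {
        Remove-CMContentDistribution -PackageId $Pkg.PackageID `
            -DistributionPointGroupName $Group.Name -Force -ErrorAction SilentlyContinue
    }
}

# Then every individual Distribution Point, same order
foreach ($DP in Get-CMDistributionPoint) {
    $DPName = $DP.NetworkOSPath.TrimStart('\')
    foreach ($App in $Apps) {
        Remove-CMContentDistribution -ApplicationName $App.LocalizedDisplayName `
            -DistributionPointName $DPName -Force -ErrorAction SilentlyContinue
    }
    foreach ($Pkg in $Packages) {
        Remove-CMContentDistribution -PackageId $Pkg.PackageID `
            -DistributionPointName $DPName -Force -ErrorAction SilentlyContinue
    }
}
```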

5. When finished, we can see that the distributed content has been removed successfully.

Please read the comment based help to get a better understanding of what is actually running in the background.

End

This can of course be extended with more choices at every step, but at the moment I did not see the need for it.

If anyone has any questions or just wants to discuss their point of view on this post, I would be more than happy to have a dialogue. Please email me at johan.nilsson@xenit.se or comment below.



Querying Microsoft Graph with Powershell, the easy way

Microsoft Graph is a very powerful tool for querying organization data, and it’s also really easy to do using Graph Explorer, but that isn’t built for automation.
While the concept I’m presenting in this blogpost isn’t entirely new, I believe my take on it is more elegant and efficient than what I’ve seen other people use.

So, what am I bringing to the table?

  • Zero dependencies on Azure modules; .NET Core & Linux compatibility!
  • Recursive/paged processing of Graph data (without the need for FollowRelLink, currently only available in Powershell 6.0)
  • Authenticates using an Azure AD Application/service principal
  • REST compatible (Get/Put/Post/Patch/Delete)
  • Supports json-batch jobs
  • Supports automatic token refresh. Used for extremely long paging jobs
  • Accepts Application ID & Secret as a pscredential object, which allows the use of Credential stores in Azure automation or use of Get-Credential instead of writing credentials in plaintext

Sounds great, but what do I need to do in order to query the Graph API?

First things first: create an Azure AD application, register a service principal and delegate Microsoft Graph/Graph API permissions.
Plenty of people have documented this, so I won’t provide an in-depth guide. Instead we’re going to walk through how to use the functions line by line.

When we have an Azure AD Application we need to build a credential object using the service principal appid and secret.

Then we acquire a token; here we need a tenantID in order to let Azure know the context of the authorization token request.

Once a token is acquired, we are ready to call the Graph API. So let’s list all users in the organization.
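The three steps above can be sketched with plain Invoke-RestMethod calls. This is a simplified stand-in for the functions at the end of the post; the tenant name is a placeholder:

```powershell
# Build a credential object: UserName = Application ID, Password = secret.
$Credential = Get-Credential

# Acquire an app-only token from the v2.0 endpoint.
$TenantId = 'contoso.onmicrosoft.com'   # placeholder tenant
$TokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token" `
    -Body @{
        grant_type    = 'client_credentials'
        client_id     = $Credential.UserName
        client_secret = $Credential.GetNetworkCredential().Password
        scope         = 'https://graph.microsoft.com/.default'
    }

# Call Graph and list the first page of users.
$Headers  = @{ Authorization = "Bearer $($TokenResponse.access_token)" }
$Response = Invoke-RestMethod -Headers $Headers -Uri 'https://graph.microsoft.com/v1.0/users'
$Response.value | Select-Object displayName, userPrincipalName
```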

In the response, we see a value property which contains the first 100 users in the organization.
At this point some of you might ask: why only 100? Well, that’s the default limit on Graph queries, but it can be raised by adding a $top filter to the URI, which allows you to query up to 999 users at a time.

The cool thing with my function is that it detects if your query doesn’t return all the data (has a follow link) and gives a warning in the console.

So, we just add $top=999 and use the recursive parameter to get them all!

What if I want to get $top=1 (wat?) users, but recursive? Surely my token will expire after 15 minutes of querying?

Well, yes. That’s why we can pass a tokenrefresh and credentials right into the function and never worry about tokens expiring!
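The paging (and token refresh) idea can be sketched like this, assuming the $Headers hashtable from the token step:

```powershell
# Follow @odata.nextLink until every page has been collected.
# Single quotes keep the literal $top in the URI.
$Uri = 'https://graph.microsoft.com/v1.0/users?$top=999'
$AllUsers = while ($Uri) {
    $Page = Invoke-RestMethod -Headers $Headers -Uri $Uri
    $Page.value                       # emit this page's users
    $Uri = $Page.'@odata.nextLink'    # $null when there are no more pages
    # For very long jobs, re-acquire the token here before it expires.
}
```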

What if I want to delete a user?

That works as well. Simply change the method (Default = GET) to DELETE and go!

Deleting users is fun and all, but how do we create a user?

Define the user details in the body and use the POST method.
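Both calls can be sketched as raw REST requests; the user details and UPN below are placeholders:

```powershell
# Delete a user (id or UPN; placeholder shown).
Invoke-RestMethod -Method Delete -Headers $Headers `
    -Uri 'https://graph.microsoft.com/v1.0/users/jane.doe@contoso.com'

# Create a user: define the details in the body and POST them.
$Body = @{
    accountEnabled    = $true
    displayName       = 'Jane Doe'
    mailNickname      = 'jane.doe'
    userPrincipalName = 'jane.doe@contoso.com'
    passwordProfile   = @{
        forceChangePasswordNextSignIn = $true
        password                      = 'PlaceholderP@ssw0rd'
    }
} | ConvertTo-Json
Invoke-RestMethod -Method Post -Headers $Headers -Body $Body `
    -ContentType 'application/json' -Uri 'https://graph.microsoft.com/v1.0/users'
```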

What about json-batching, and why is that important?

Json-batching is basically up to 20 unique queries in a single call. Many organizations have thousands of users, if not hundreds of thousands, and since many queries need to be run against individual users, that adds up, and it takes time. Jobs with json-batching that used to take 1 hour now take about 3 minutes to run; 8-hour jobs now take about 24 minutes. If you’re not already sold on json-batching, I have no idea why you’re still reading this post.

This can be used statically by creating a body with embedded queries, or dynamically, as in the example below. We have all users flat in a $users variable. We determine how many times the loop needs to run, build a $body json object with 20 requests per call, run the query against the $batch endpoint with the POST method, collect the results in a $responses array, and tada! We’ve made the querying of Graph 20x more efficient.
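A simplified sketch of that batching loop, assuming a flat $Users array and the $Headers from the token step (the memberOf sub-query is just an example workload):

```powershell
# Slice $Users into groups of 20 and POST each group to the $batch endpoint.
$Responses = for ($i = 0; $i -lt $Users.Count; $i += 20) {
    $Slice = $Users[$i..([Math]::Min($i + 19, $Users.Count - 1))]

    # One request object per user in the slice; ids must be unique per batch.
    $Requests = for ($j = 0; $j -lt $Slice.Count; $j++) {
        @{ id = "$j"; method = 'GET'; url = "/users/$($Slice[$j].id)/memberOf" }
    }

    $Body = @{ requests = @($Requests) } | ConvertTo-Json -Depth 4
    (Invoke-RestMethod -Method Post -Headers $Headers -Body $Body `
        -ContentType 'application/json' `
        -Uri 'https://graph.microsoft.com/v1.0/$batch').responses
}
```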

Sounds cool, what more can I do?

Almost anything related to the Office 365 suite. Check out the technical resources and documentation for more information; Microsoft is constantly updating and expanding the API functionality. Scroll down for the functions; they should work on Powershell 4 and up!

Technical resources:

Creating an Azure AD application
https://www.google.com/search?q=create+azure+ad+application

Graph API
https://docs.microsoft.com/en-gb/graph/use-the-api

About batch requests
https://docs.microsoft.com/en-gb/graph/json-batching

Known issues with Graph API
https://docs.microsoft.com/en-gb/graph/known-issues

Thanks to:
https://blogs.technet.microsoft.com/cloudlojik/2018/06/29/connecting-to-microsoft-graph-with-a-native-app-using-powershell/
https://medium.com/@mauridb/calling-azure-rest-api-via-curl-eb10a06127

Functions