Microsoft Ignite 2022 Review | A good start, but hopefully not repeated

Before we talk about Ignite 2022, let’s lay out my personal biases first. I have attended 16 Microsoft TechEd/Ignite conferences, not having missed one since 2004, a fact I hold with a lot of pride. I largely attribute my professional achievements and skills to these events. While it’s always fun to watch newbies miss the important parts of the event for parties, free beer, expo swag, etc., the real professionals use these events to stay ahead of the constant state of change in the industry. One more bias: I have a distaste for Microsoft Marketing in recent years. They oversell products that don’t exist yet, or ones that do but don’t function at the levels they claim. More on why that matters later.

I am a Microsoft administrator of 20 years who witnessed the fall of Novell and deployed NT4 and then AD 2000+. Fast-forward to now: ahead of the curve with thousands of Azure AD joined (not hybrid) workstation clients, over 200 SAML SSO apps via Azure AD, MFA with number matching (aka passwordless) as part of the onboarding process, and almost no VPN requirements for my end users.

Sadly, I must admit I didn’t get enough out of the event. What upsets me most about this is that next year is slated to be another two-day format, which is core to why I didn’t get enough out of the event this year.

“Two days isn’t enough.”

Both mornings had keynotes that consumed over 25% of the entire event. Those keynotes can help provide guidance on where Microsoft thinks it is going. However, it’s mostly just Microsoft Marketing laying groundwork for hopefully self-fulfilling prophecies. Increasingly, they are selling ideas or concepts of future features which aren’t even in private preview yet. The note on company direction is useful, but not at the cost of 25% of the event.

Mr. Nadella (Microsoft CEO) couldn’t even be bothered to deliver the keynote in person, even though the event was in his home state. This felt like a major smack in the face to those of us who flew 6 hours to attend in person, and it speaks to just how little that half-day-plus of keynotes actually mattered.

Let’s cover the three most critical components of Ignite: the deep technical sessions, the face time with product managers and engineers, and the networking with peers.

“Not enough Deep Technical Sessions”

There is always an internal fight between event planners and speakers over the technical depth and length of a session. Level 200 sessions are almost always useless to me. They’re sales pitches; I don’t need the sales pitch, I’m already sold. I need to know how to deploy and manage the solution.

Far too often the needed real-world knowledge doesn’t make it into docs.microsoft.com (which is perpetually outdated, incorrect, or simply missing critical details as a byproduct of Microsoft’s newfound agility). The level 300/400 sessions are hosted by PMs, Engineers, and MVPs. These professionals always deliver value without the filter of marketing’s specter, and they provide enough tactical information to actually start deploying solutions (or avoiding the gotchas). 

There were not enough deep technical sessions. This gets back to my point that a day and a half isn’t enough time to cover all Microsoft product solutions that I need to be an expert in. There wasn’t even a specific session about Microsoft Teams Shared Channels, and that’s the exact kind of session I needed and expected this year.

“Face time with Product teams”

The next most important feature of these events is face time with product managers and engineers. It’s where I can really get straight answers. It’s access so direct that it helps solve our design issues, and the critical feedback we provide tends to have an impact on future releases.

I had almost no face time this year, which was infuriating. There wasn’t enough expo space for each product team to have its own area. Instead, a scheme was devised to use that precious day and a half of session time for “ask the expert” windows, where a given product team might be in a specific area for about two hours. If those two hours overlapped with a not-to-be-missed session, you ended up having to choose.

“Opportunities to network with peers weren’t as prevalent as they should have been”

There were very short gaps between sessions, which left little time to strike up conversations with the people I was sitting next to. The meals were so basic that people didn’t linger over them, and there wasn’t even an hour for lunch. The lack of a proper vendor expo hall made this worse, as there was no reason to stick around for the end-of-day free drinks and snacks.

“Cost and Time Constrained”

I gave the Microsoft events team a lot of leeway for this event. I wouldn’t be shocked if, at the start of 2022, they didn’t know whether they would put on the event at all. The short window to throw the event together caused things like the lack of swag. To be clear, I don’t care if I get a 17th backpack; in fact, my wife will be thrilled not to have to make me pick one to toss this year. But a lot of people were wondering if Microsoft was being cheap by skipping swag. I don’t think so; I think logistically they couldn’t pull it off.

But on the topic of cheap, I wonder how much the event budget played a part. Fewer attendees, far fewer vendors, and perhaps many of the other issues (the length of the event, the shortage of large session rooms, not enough space in the hub for all product teams to have a home base, even the lack of enough proper sessions): can these all be blamed on cost?

“This was a v1 Hybrid Infant event”

Microsoft event staff seemed downright giddy about fleshing out this half in-person, half online format. I heard comments that future years might have multiple in-person locations, with sessions broadcast to the other locations and to remote users.

While I think the idea is “cool”, I think the event staff are losing sight of what the event “should” be. I get this awful feeling the event is turning into one big sales pitch instead of what it “needs” to be: education. More than ever before, the lack of authored books or proper documentation coming from the product teams means this event must fill the gap, if Microsoft wants to see its customers deploy its new solutions.

One misconception that was abundantly clear was the idea that we would spend part of our day and a half of session time watching the online-only content. While it’s true that many of the sessions were recorded or online only, that skips an important fact: after this week my carriage turns back into a pumpkin. I will be thrust into never-ending backlogs, and my time for skills advancement will be over.

Speaking of the rushed, chaotic nature of the event, I was not the only person who thought there were sessions on Friday the 14th. With this misunderstanding, I booked my travel home on Saturday (#NoRedEyes). That left me in Seattle for a whole day with no event to attend. I ended up finding great spots in the Starbucks Roastery and the Seattle Public Library to get through as many recorded sessions as I could. At 1.25x playback speed, and armed with the skip-ahead-10-seconds button, I got through 10 of them, far more than on any day of the event itself. If we can’t persuade Microsoft to bring back the 4.5-day format, I would likely book through Saturday again next year, just so I have that one last day to learn more.

For 2023 I would like to see the event restored to almost five days. That would leave enough time to jam in all those level 200 keynotes/sales pitches and still leave room for the level 300 sessions my colleagues and I need. They also need a hub big enough that each product team has a defined space, and they need to make sure those experts are in that space during the end-of-day drinks and food. They need larger session rooms, and more of them. They need to encourage more MVPs to submit technical session ideas. Better yet, they should ask customers what sessions they would like to see (Microsoft Teams Shared Channels, cough cough). They need to make the gaps between sessions larger, at least 30 minutes, with a full hour for lunch so there is time to visit the hub. And they need to run sessions until 6pm and start them earlier, like they did in past years.

To be clear, I learned things, just not enough for a whole year. Almost like Moore’s law, the rate of change in M365/Azure is accelerating year after year, and I’m getting more staff to manage it all. I need more technical information to be as successful as in previous years. Like I said, this year was the first one back. Microsoft gets a pass this year, but next year can’t be like this year, or I fear I won’t be able to keep at the bleeding edge of innovation and security at my company. 

Stop Chrome (or any app) from preventing Screen Locking and/or Screen Saver

A minor problem has plagued me for some time: I would be done for the day and leave the home office, and yet hours later all four screens were still on. I hate paying for the power to leave my screens on all night, plus it reduces the screens’ longevity. Most importantly, it’s a security issue; I want my computer to lock when I am not at it. Many times I press Win+L to lock, but sometimes I forget.

I generally leave my security cams up on the top screen, and I was fairly sure Chrome has a way of telling Windows not to go to sleep while media is playing. Well, I was right.

Detecting the Issue

Simply run this command to see what is holding up the system:

powercfg /requests

Notice under DISPLAY: that Chrome is playing video?

The Fix

To block Chrome from preventing the computer from sleeping, simply run this command (change chrome to another app name if it’s not Chrome):

powercfg -requestsoverride PROCESS chrome.exe awaymode display system
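
Two related invocations are worth knowing if you ever need to audit or undo this. Run with no arguments, /requestsoverride lists the overrides currently in effect, and re-running the override command with the process name but no request types clears it (run elevated, as above):

```powershell
# List all request overrides currently in effect
powercfg /requestsoverride

# Clear the override for Chrome (omitting the request types removes it)
powercfg -requestsoverride PROCESS chrome.exe
```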

Enjoy,

-Eric

PowerShell Error: The underlying connection was closed: An unexpected error occurred on a send

I got mad the other day trying to do a simple wget (i.e. Invoke-WebRequest) to an Azure Function I made, and I was getting:

The underlying connection was closed: An unexpected error occurred on a send

I tried switching to the .NET WebClient class, but got the same error.

What was more frustrating is that it worked on my dev machine, and worked in a browser on the server I was running the code on, just not in PowerShell.

The Fix

Apparently Windows PowerShell 5 (via the .NET Framework) defaults to TLS 1.0, while Azure Functions require TLS 1.2. The fix is super simple: just add this in your code on its own line:

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
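
If other code in the same session might rely on additional protocols, a slightly gentler variant (same class, just OR-ing the TLS 1.2 flag into whatever is already enabled) avoids clobbering the existing setting:

```powershell
# Add TLS 1.2 to the currently enabled protocols instead of replacing them outright
[Net.ServicePointManager]::SecurityProtocol = [Net.ServicePointManager]::SecurityProtocol -bor [Net.SecurityProtocolType]::Tls12
```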

Record Hyper-V Console

Every few months, thanks to Windows 10, it’s time to roll out a new image. This is a simple yet tedious task. Thanks to modern-day multitasking, it’s easy to miss something while testing new images, and then I have to restart the whole process, wasting time.

This script records the screen by taking screenshots every second. I suppose you could use a third-party tool to merge them into a video if you really needed to.

The script includes de-duplication of images, so if the screen stops moving, so does the recording. That plus using JPEG format keeps the images fairly small.

Oh yes, don’t forget to “Run as Admin”.
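
The heart of the approach is the de-duplication check. As a rough sketch of that idea (a hypothetical helper, not the actual script: it hashes each capture and writes a file only when the hash changes):

```powershell
# Hypothetical sketch: save a captured frame only if it differs from the last one saved.
function Save-IfChanged {
    param(
        [byte[]]$Frame,          # raw image bytes for the current capture
        [string]$OutputFolder,   # where the JPEG frames accumulate
        [ref]$LastHash           # hash of the previously saved frame
    )
    $md5  = [System.Security.Cryptography.MD5]::Create()
    $hash = [System.BitConverter]::ToString($md5.ComputeHash($Frame))
    if ($hash -eq $LastHash.Value) { return $null }   # screen unchanged: skip this frame
    $name = 'frame-{0}-{1}.jpg' -f (Get-Date -Format 'yyyyMMdd-HHmmss-fff'), $hash.Substring(0, 8)
    $path = Join-Path $OutputFolder $name
    [System.IO.File]::WriteAllBytes($path, $Frame)
    $LastHash.Value = $hash
    return $path
}
```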

Enjoy!

Big thanks to Ben Armstrong for the original work on this script:

https://blogs.msdn.microsoft.com/virtual_pc_guy/2016/05/27/capturing-a-hyper-v-vm-screen-to-a-file/

Windows Update Stuck on “Searching for Updates” on Windows Server 2012 R2

This one was a nightmare. If you search the internet for “Searching for Updates” you will find a lot of pages, but none that I saw had this resolution.

In my case, the problem was actually related to Adobe Flash updates. After working with Microsoft Support, it was discovered that a large number of pending Flash updates were causing the search to never finish, so the fix was to manually update Flash by installing KB3214628.

Hope this helps someone else out; this took Microsoft Support weeks to figure out.

-Eric

Query Azure SQL Database Table via PowerShell

Real quick one… I have used similar code for ages to query local on-premises SQL databases. However, Azure requires the use of encrypted connections. Here is some fully working code:

#Set Defaults (Optional) which allows you to skip defining instance, user, and password each time
$AzureDefaultInstanceName = "myUniqueAzureSQLDBName"
$AzureDefaultUserID = "myUserIDToAzureSQL"
$AzureDefaultPassword = "myPasswordToAzureSQL"

#The actual function
Function get-azureSQL (
[string]$InstanceName = $AzureDefaultInstanceName
,[string]$UserID = $AzureDefaultUserID
,[string]$Password = $AzureDefaultPassword
,[string]$Query){

$connectionString = "Server=tcp:$($InstanceName).database.windows.net,1433;"
$connectionString = $connectionString + "Database=$($InstanceName);"
$connectionString = $connectionString + "User ID=$($UserID)@$($InstanceName);"
$connectionString = $connectionString + "Password=$($Password);"
$connectionString = $connectionString + "Encrypt=True;"
$connectionString = $connectionString + "TrustServerCertificate=False;"
$connectionString = $connectionString + "Connection Timeout=30;"

$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = $connectionString

$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = $Query
$SqlCmd.Connection = $SqlConnection

$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd

$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet) | Out-Null
$SqlConnection.Close()

return $DataSet.Tables[0]
}

#Querying Azure SQL using Defaults defined above
get-azureSQL -Query "select * from logs"

#Querying Azure SQL without Defaults
get-azureSQL -InstanceName "myUniqueAzureSQLDBName" -UserID "myUserIDToAzureSQL" -Password "myPasswordToAzureSQL" -Query "select * from logs"

Sure, you could install the Azure PowerShell module and the SQL commands too, but most of the time you just need to get in quick and grab something. This code is fast, works every time for me, and best of all: no installs.

If this helped you or you want to suggest an improvement, please just leave it in the comments.

Enjoy,

-Eric

Secure PowerShell Scripts running via Windows Task Scheduler using MD5 Hashes to safeguard against Tampering

Over the years, the number of Task Scheduler-based PowerShell scripts I run has increased. However, this poses a serious potential security risk.

The Security Issue

Given that these tasks commonly run as a service account with additional rights, they are a potential attack vector.

Simply changing the underlying script can give an attacker access to anything the service account can reach. Even signing the scripts can be useless, as the system can be configured to ignore signing.

The Solution

I have created a one-liner for Task Scheduler that will only run the script if the script’s hash matches the hash embedded in the one-liner. If someone tries to change the task in Task Scheduler, they would be required to re-enter the proper password.

powershell.exe -command if ([System.BitConverter]::ToString((New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider).ComputeHash([System.IO.File]::ReadAllBytes('C:\temp\test.ps1'))) -eq '33-CD-2A-54-ED-F3-0F-94-5F-D2-97-D9-FE-4F-45-79') {. c:\temp\test.ps1} else {Send-MailMessage -SmtpServer smtp.server.domain.com -From whatever@domain.com -To you@domain.com -Subject 'Failed to Run Script - Hash Not Correct'}

Notes about One Line Script Executor

  • You need to replace c:\temp\test.ps1 with the path to your script. (two places in this example)
  • You must supply the hash of the script. (use the following command to get it)

[System.BitConverter]::ToString((New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider).ComputeHash([System.IO.File]::ReadAllBytes('C:\temp\test.ps1')))

  • Script will email you if hash fails.
    • Change TO: and FROM: to match your needs.
  • Do NOT use double quotes in this one-liner. Don’t forget that the command passes through CMD, which will strip out double quotes.
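
One possible hardening, for what it’s worth: MD5 is fine for catching accidental edits, but it is not collision-resistant, so a determined attacker could in principle craft a malicious script with a matching MD5. Here is a sketch of the same guard as a function using SHA-256 via Get-FileHash (the helper name is mine, not part of the original one-liner; the email alert is left as a comment):

```powershell
# Hypothetical helper: dot-source a script only when its SHA-256 matches.
function Invoke-IfHashMatches {
    param(
        [string]$Path,            # path to the guarded script
        [string]$ExpectedSha256   # known-good hash, stored with the task definition
    )
    $actual = (Get-FileHash -Path $Path -Algorithm SHA256).Hash
    if ($actual -eq $ExpectedSha256) {
        . $Path                   # hash matches: run the script
        return $true
    }
    # Hash mismatch: alert here (e.g. Send-MailMessage, as in the one-liner above)
    return $false
}
```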

If this helped you or perhaps you have suggestions to make it better, please do leave them in the comments.

Enjoy

-Eric

Unauthorized 401 when calling Coldfusion CFC Component WebService on IIS

If you just set up a fresh ColdFusion/IIS box and all of a sudden one of your CFC component web services returns a 401, you are not alone!

I bet you went to the folder and triple-checked in IIS that Anonymous Authentication was enabled and everything else was disabled, and yet it still didn’t work. Right about that time, perhaps you start questioning everything you know in this world. I mean, IIS is set to anonymous, yet it’s telling you it’s not authenticating, as if the request were sent to Windows Authentication.

The Solution

Rest assured, you are not losing it. You, like me, likely made the mistake of blanket-enabling Windows Authentication at the root, which in turn enabled it for the virtual folder:

/jakarta

CFCs must pass through this folder since they are processed server-side. Anyway, the easy solution is to set the /jakarta folder back to Anonymous Authentication.
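
If you prefer scripting the change over clicking through IIS Manager, appcmd can flip the two authentication settings on the virtual folder. A sketch, assuming the default appcmd location and a site actually named "Default Web Site" (adjust both for your box):

```powershell
# Enable anonymous auth and disable Windows auth on the /jakarta virtual folder
& "$env:windir\system32\inetsrv\appcmd.exe" set config "Default Web Site/jakarta" /section:system.webServer/security/authentication/anonymousAuthentication /enabled:true /commit:apphost
& "$env:windir\system32\inetsrv\appcmd.exe" set config "Default Web Site/jakarta" /section:system.webServer/security/authentication/windowsAuthentication /enabled:false /commit:apphost
```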

Hey if this helped you or you know something I should add to make it better, please leave it in the comments!

-Eric

Connect to Azure SQL Database using ColdFusion 10/11/2016

My how the years fly and things change.

Even in 2017 I still find value in making quick enterprise applications in ColdFusion. However, the world is changing; many of my endpoints are beyond the boundaries of my corporate firewalls.

I have ended up with a ton of Node.js web service endpoints running as Docker containers in Azure, jamming data into Azure SQL. I want ColdFusion to be able to use that data.

The Solution

The solution is stupid easy… you can use the native Microsoft SQL driver; no need to mess with anything else.

Go ahead and put in the basics:

  • Database: Name as shown in Azure
  • Server: something.database.windows.net
  • Port: 1433
  • Username: <sqlaccountname>@<databasename>
  • Password: <password>

Then for the secret sauce

  • Hit Show Advanced Settings
  • In the connection string put the following:

EncryptionMethod=SSL;Encrypt=yes;TrustServerCertificate=no;

Note: Encrypt=yes may not be needed, but since it’s working I am not touching it.

And that’s it!

If this was helpful, or you have a way to make it better, let me know in the comments.

-Eric