VM Hangs – Azure, AWS, GCP – How to analyse why?

Get data

  1. Connect to the Serial Console in Azure.
  2. Type cmd.
  3. Press Escape + TAB – you should reach a page where you can enter the username and password.
  4. Try to authenticate – now you have access to the operating system, so you can do the analysis.
  5. If that is not possible, return to the Serial Console (Escape + TAB).
  6. Type crashdump and press Enter. This generates a memory dump that can be analysed.
  7. Restart the machine and download c:\windows\memory.dmp (one way to copy it out is shown in the sketch below).
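
One way to get the dump off the VM is to copy it to a storage account with PowerShell. This is a minimal sketch, assuming the Az.Storage module is installed and that the storage account and container (placeholder names below) already exist and you can write to them:

# Upload the kernel memory dump to blob storage so it can be analysed on another machine
Import-Module Az.Storage
Connect-AzAccount

$storageAccount = "mystorageaccount"   # placeholder – your storage account
$container      = "dumps"              # placeholder – an existing container
$ctx = New-AzStorageContext -StorageAccountName $storageAccount -UseConnectedAccount

Set-AzStorageBlobContent -File "C:\Windows\MEMORY.DMP" -Container $container -Blob "memory.dmp" -Context $ctx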

Install the Windows Debugger (WinDbg). The best option is to use Windows 11 for this.

Launch WinDbg and open the downloaded memory.dmp. You do not need to add a symbol server – in the latest version it is picked up automatically.

Execute the following:

  1. !analyze -v – in this scenario it tells you nothing, because you intentionally invoked the system crash. If the system restarts automatically, it can help in the analysis (especially for blue screens).
  2. !process 0 0 – Shows all processes.
  3. !process 0 7 – Look for any process with unusual CPU time or state.
  4. !locks – Shows kernel locks and potential deadlocks.
  5. !irql – Check interrupt request levels.
  6. !ready – Shows threads ready to run (may indicate scheduling issues).
  7. !vm – Virtual memory usage.
  8. !poolused – Kernel pool usage – look for pool exhaustion.
  9. !verifier – Driver verifier status.
  10. !devnode 0 1 – Device tree – look for failed devices.
  11. lm t n – List loaded modules/drivers with timestamps.
  12. !drvobj – Examine specific driver objects if you see suspicious ones.
  13. !thread – Examine the current thread context.
  14. !qlocks – Queue locks.
  15. !wdfkd.wdflogdump – Windows Driver Framework logs (if applicable).
  16. !poolused 2 – This shows pool usage sorted by size (largest first) – helps identify the biggest memory consumers.
  17. !deadlock – Looks for deadlock detection information.

You can feed the results into any AI tool for further analysis.

My last analysis ended with this message:

******* 266688 kernel stack PTE allocations have failed ******
******* 365226304 kernel stack growth attempts have failed ******

Azure Extended Network – The same address space (subnet) On-Prem and Azure

During migration to Azure we sometimes need to have the same addresses (subnet) in Azure and on-premises. The natural answer is that this is impossible… but that is not true! You can extend your on-premises subnets into Azure using Extended Network for Azure.

 

The problem is that the extension is not available in the Admin Center right now, but you can download it from here or as msft.sme.subnet-stretch.2.15.0. Then you must unzip it, put it on disk, add it as a feed, and install the extension.

 

Microsoft Build 2025 opening keynote by links

Microsoft Build 2025 opening keynote just finished – here are the links presented by Satya Nadella:

Link Summary
https://aka.ms/AgenticDevOps An exploration of how AI agents are transforming DevOps practices through automated workflows and intelligent monitoring systems.
https://aka.ms/M365CopilotUpdates Latest feature updates for Microsoft 365 Copilot designed to enhance productivity across Word, Excel, PowerPoint, and other Microsoft applications.
https://aka.ms/TeamsAILibrary A comprehensive library of AI capabilities for Microsoft Teams that developers can leverage to create custom intelligent collaboration features.
https://aka.ms/ThirdPartyAgents Documentation for integrating third-party AI agents with Microsoft’s ecosystem, enabling expanded functionality and specialized capabilities.
https://aka.ms/CopilotTuning Guidelines and tools for fine-tuning Microsoft Copilot to better align with specific organizational needs and terminology.
https://aka.ms/FoundryStories Case studies highlighting successful implementations of Microsoft AI Foundry across various industries and use cases.
https://aka.ms/Aisin A partnership showcase between Microsoft and Aisin leveraging AI to revolutionize automotive manufacturing processes.
https://aka.ms/ModelRouter An intelligent system that dynamically routes AI requests to the most appropriate model based on task requirements and performance metrics.
https://aka.ms/FoundryGrok Microsoft’s integration documentation for connecting Anthropic’s Claude models within the AI Foundry environment.
https://aka.ms/BuildGrok Developer resources for building applications with Grok AI capabilities on Microsoft’s cloud infrastructure.
https://aka.ms/FoundryModels A catalog of pre-trained AI models available through Microsoft AI Foundry for various business and technical applications.
https://aka.ms/FoundryHuggingFace Microsoft’s implementation guide for deploying and scaling Hugging Face models within the AI Foundry platform.
https://aka.ms/FoundryAgentService A managed service that simplifies the deployment and orchestration of AI agents within Microsoft’s cloud ecosystem.
https://aka.ms/AIAppPlatform Microsoft’s comprehensive platform for building, testing, and deploying AI-powered applications with integrated development tools.
https://aka.ms/FoundryCopilotStudio A specialized environment for creating custom Copilot experiences tailored to specific business domains and workflows.
https://aka.ms/Stanford Microsoft’s research collaboration with Stanford University focusing on advancing responsible AI and machine learning technologies.
https://aka.ms/HealthcareAgentArchestrator An orchestration system designed specifically for managing AI agents in healthcare settings with appropriate security and compliance features.
https://aka.ms/FoundryObservability Monitoring and diagnostic tools for tracking AI model performance and behavior within the Microsoft AI Foundry environment.
https://aka.ms/EntraAgentID Microsoft Entra’s identity management solution specialized for authenticating and authorizing AI agents within enterprise systems.
https://aka.ms/SecurityForAI Comprehensive security guidelines and best practices for implementing AI systems while protecting sensitive data and preventing misuse.
https://aka.ms/FoundryLocal Tools for deploying Microsoft AI Foundry capabilities in local environments without requiring cloud connectivity.
https://aka.ms/WindowsAIFoundry Integration guide for connecting Windows operating system capabilities with Microsoft AI Foundry for enhanced desktop experiences.
https://aka.ms/WSLOpenSource Open-source projects and tools related to Windows Subsystem for Linux that support AI development workflows.
https://GitHub.com/Microsoft/NLWeb Microsoft’s open-source repository for natural language processing technologies optimized for web applications.
https://aka.ms/DataStories A collection of real-world examples demonstrating how organizations have leveraged data analytics and AI to solve business challenges.
https://aka.ms/NFL Microsoft’s partnership with the National Football League showcasing AI applications for sports analytics and fan engagement.
https://aka.ms/FoundryCosmosDB Integration documentation for using Cosmos DB as a data store for AI applications built on the Microsoft AI Foundry platform.
https://aka.ms/FoundryDatabricks Guidelines for incorporating Databricks analytics capabilities within the Microsoft AI Foundry ecosystem.
https://aka.ms/PostgreSQLGenAI Tools and patterns for implementing generative AI features with PostgreSQL databases on Microsoft Azure.
https://aka.ms/FabricCosmosDB Resources for using Cosmos DB within Microsoft Fabric to power data-intensive AI workloads and applications.
https://aka.ms/FabricDigitalTwin Implementation guide for creating digital twin solutions using Microsoft Fabric and AI capabilities for predictive modeling.
https://aka.ms/AIReadyOneLake Documentation for Microsoft’s data lake solution optimized for large-scale AI training and inferencing workloads.
https://aka.ms/FabricChatWithYourData Microsoft Fabric’s functionality that enables conversational interfaces to interact with enterprise data using natural language.
https://aka.ms/BuildNVIDIA Resources for leveraging NVIDIA hardware acceleration when building AI solutions on Microsoft’s cloud platform.
https://aka.ms/MetOfficeUK Case study of how the UK Met Office uses Microsoft AI technologies to improve weather forecasting and climate modeling.
https://aka.ms/Science Microsoft’s scientific computing resources and research initiatives advancing the intersection of AI and multiple scientific disciplines.
https://aka.ms/ScienceStories Success stories highlighting how Microsoft AI has enabled scientific breakthroughs across various research domains.
https://aka.ms/BuildWithAI Comprehensive guide for developers on integrating Microsoft’s AI services into custom applications and workflows.
https://aka.ms/CopilotEducation Educational resources and curriculum materials for teaching students how to effectively use Microsoft Copilot for learning and research.

SharePoint Sites Templates – from the battlefield

# Install Required Modules

Install-Module PnP.PowerShell -RequiredVersion 1.12.0 -Force
Install-Module PnP.PowerShell

# Register AppID
Register-PnPEntraIDAppForInteractiveLogin -ApplicationName "PnP PowerShell" -SharePointDelegatePermissions "AllSites.FullControl" -Tenant xxx.onmicrosoft.com

 

# Connect to the SharePoint Online

$adminSiteUrl = "https://xxx-admin.sharepoint.com/"
$siteUrl = "https://xxx.sharepoint.com/sites/Test01"

Connect-PnPOnline $adminSiteUrl -Interactive -ClientId xxx

# Test connection (display all SharePoint Online Sites)
Get-PnPTenantSite

 

# Get the JSON template from an example site that we want to use as the basis for the template.

Get-PnPSiteScriptFromWeb -Url $siteUrl -IncludeAll > template.json

 

# Create template

$siteScriptFile = Join-Path $PSScriptRoot "template.json"
$webTemplate = "64" # 64 = Team Site, 68 = Communication Site, 1 = Groupless Team Site
$siteScriptTitle = "Team01 Team Site Script"
$siteDesignTitle = "Team01 Team Site Template"
$siteDesignDescription = "Custom team site template with multi-colored theme, external sharing disabled and some cool stuff via Power Automate."

$designPackageId = "6142d2a0-63a5-4ba0-aede-d9fefca2c767" # The default site template to use as a base when creating a communication site, more info later.

$siteScript = (Get-Content $siteScriptFile -Raw | Add-PnPSiteScript -Title $siteScriptTitle) | Select -First 1 Id

Add-PnPSiteDesign -Title $siteDesignTitle -SiteScript $siteScript.Id -WebTemplate $webTemplate -Description $siteDesignDescription -DesignPackageId $designPackageId

# Now you can apply template using GUI
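
# Alternatively, apply the design from PowerShell. A minimal sketch, assuming the design created above (cmdlets from PnP.PowerShell):
$design = Get-PnPSiteDesign | Where-Object { $_.Title -eq $siteDesignTitle }
Invoke-PnPSiteDesign -Identity $design.Id -WebUrl $siteUrl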

# Display information about scripts and designs

Get-PnPSiteDesign
Get-PnPSiteScript

# Delete the template

Remove-PnPSiteDesign -Identity xxx
Remove-PnPSiteScript -Identity xxx

 

# Export Home.aspx design

Connect-PnPOnline $SiteUrl -Interactive -ClientId xxx
Export-PnPPage -Identity Home.aspx -Configuration -Out home.xml

# Apply Home.aspx design

$destUrl = "https://xxx.sharepoint.com/sites/Team04"
Connect-PnPOnline $destUrl -Interactive -ClientId xxx
Invoke-PnPSiteTemplate -Path .\home.xml

The above can be invoked as an Azure Function.

A lot of Tips & Tricks can be found here.

Installing Azure Arc on Unsupported Machines 

 

Azure Arc enables you to manage and govern your servers, Kubernetes clusters, and applications across on-premises, multi-cloud, and edge environments. However, sometimes you might encounter unsupported machines. This guide will help you bypass these limitations and install Azure Arc on unsupported operating systems. 

1: Identify the Closest Supported Operating System 
First, identify the closest supported operating system for your machine. For example, if your machine runs an unsupported version of Linux, find the closest supported version, such as Ubuntu 24.04 for ARM. 

2: Execute Commands on the Supported Machine 
On the closest supported machine, execute the following commands to gather information about the operating system: 

cat /etc/os-release 

uname -m 

uname -s 

In my case, the commands return the following: 

  • uname -m: aarch64 or x86_64 
  • uname -s: Linux 
  • The output from `cat /etc/os-release` might look like this: 

PRETTY_NAME="Ubuntu 24.04.1 LTS"
NAME="Ubuntu"
VERSION_ID="24.04"
VERSION="24.04.1 LTS (Noble Numbat)"
VERSION_CODENAME=noble
ID=ubuntu
ID_LIKE=debian
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
UBUNTU_CODENAME=noble
LOGO=ubuntu-logo

 

 

3: Modify the Unsupported Operating System 

On your unsupported operating system, execute the following command to back up the original os-release file: 

cp /etc/os-release /etc/os-release.original 

Then, copy the content from the supported machine’s os-release file to the unsupported machine’s os-release file. 

4: Verify uname Commands 

Ensure that the `uname -m` and `uname -s` commands return the same values as on the supported machine. If they do not, you might need to create a shell script that overrides the original `uname` output. This topic will be covered in a different article. 

5: Execute the Script to Onboard the Machine 

Now, run the script to onboard your machine to Azure Arc. 

6: Revert the Changes 

After completing the onboarding process, revert the changes to the os-release file: 

rm /etc/os-release 

mv /etc/os-release.original /etc/os-release 

This ensures that your machine returns to its original state. 

By following these steps, you can successfully install Azure Arc on unsupported machines. Remember to always verify the outputs of the `uname` commands and revert any changes made to system files once the process is complete. 

 

Microsoft Windows 2003 Resource Kit – Run any exe as a service

The Microsoft Windows 2003 Resource Kit can be downloaded from web.archive.org here (the version that was available in 2020).

 

If you are looking for software that allows you to run any exe as service – check this (srvany-ng).

Windows and SSH

It was some time ago that a native SSH server – from Microsoft, not a third party – was announced. I mentioned it here:

 

SSH daemon for Windows / How to install an SSH daemon on Windows

Now it is time for an update: you can install the SSH server on Windows, even the ARM edition, from here:

https://github.com/PowerShell/Win32-OpenSSH/releases

 

Do not forget to open port 22:

netsh advfirewall firewall add rule name="Open SSH Port 22" dir=in action=allow protocol=TCP localport=22 remoteip=any

If you would like to use the built-in SSH, just execute:

Add-WindowsCapability -Online -Name OpenSSH.Client

Add-WindowsCapability -Online -Name OpenSSH.Server
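
After installation, you typically also need to start the service and set it to start automatically (a standard follow-up step from the OpenSSH-on-Windows documentation, shown here as a short sketch):

# Start the OpenSSH server and make it start with Windows
Start-Service sshd
Set-Service -Name sshd -StartupType 'Automatic'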

All Windows services associated with Azure Arc

To display all Windows services associated with Azure Arc, you can execute:

 

Get-WmiObject win32_service | Where-Object {$_.PathName -like "*AzureConnectedMachineAgent*"} | Select-Object Name, DisplayName, State, PathName | Format-Table -AutoSize
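
If you are on PowerShell 7+, where Get-WmiObject is no longer available, the same query can be written with Get-CimInstance:

Get-CimInstance Win32_Service | Where-Object {$_.PathName -like "*AzureConnectedMachineAgent*"} | Select-Object Name, DisplayName, State, PathName | Format-Table -AutoSize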

How to Deploy Application Functions to Existing Resources in Azure – Step by Step

Introduction 

Deploying application functions to existing resources in Azure can seem daunting, but with the right guidance, you can achieve this efficiently. This guide will walk you through each step, ensuring you can deploy your functions seamlessly. 

To begin with, you need to initialize your Azure Developer CLI (azd). This is crucial for setting up your environment and ensuring that all necessary configurations are in place. 

  1. Open your terminal or command prompt where your azd project files are located.
  2. Run the following command to initialize azd – azd init
  3. Verify and Modify Environment Values

If your application has already been deployed to other resources, it is essential to verify and potentially modify the environment values to ensure compatibility with the existing resources. 

  1. Run the command – azd env get-values 
  2. Review the values returned by the command and ensure they align with your current environment configuration. 
  3. If necessary, modify these values in the .env file to match the required settings. 

 

Update Tags for Resources

Before deploying your application, it is important to update the tags for the resources where the code will be deployed. These tags help identify and manage the application within your Azure environment. Typically, the tags to update are azd-env-name and azd-service-name.

To update the tags, navigate to your resource group in the Azure portal and modify the necessary tag values to correspond with your deployment settings. 

It is critical to ensure that the azd-env-name tag is updated for your Resource Group in the Azure portal. This tag is essential for correctly identifying the environment in which your application will be deployed.
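
If you prefer to set the tag from PowerShell instead of the portal, a minimal sketch using the Az.Resources module could look like this (the resource group and environment names are placeholders):

# Merge the azd-env-name tag onto the target resource group
$rg = Get-AzResourceGroup -Name "rg-myapp"          # placeholder resource group
Update-AzTag -ResourceId $rg.ResourceId -Tag @{ "azd-env-name" = "my-env" } -Operation Merge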

 Authenticate to Azure

Before deploying the application, it is essential to authenticate to Azure. This ensures that you have the necessary permissions to proceed with the deployment process. 

Run one of the following commands to log in:

azd auth login

or, if you prefer to use a device

azd auth login --use-device-code

Finally, execute the following command to deploy the application without provisioning new resources:

azd deploy <service> 

Replace <service> with the name of your Azure Function service as defined in your azure.yaml file. This command deploys only the application code to the service referenced by the azd-service-name tag.

Conclusion 

By following these steps, you should be able to deploy your application functions to the existing resources in Azure efficiently. Make sure to verify your environment values thoroughly and adjust them as needed to ensure a smooth deployment process. 

 

Install FreeLens – an alternative to the famous Lens for managing Kubernetes

The two installation steps are at the end of the article.

Lens is a great tool for managing and debugging a Kubernetes cluster and all aspects of deployments – https://k8slens.dev/. In the beginning it was free, but life is life and nowadays it is paid, especially for companies. But since it was free at the beginning and the source code was here, then – similar to other projects like https://opentofu.org/ (free Terraform) – here is an alternative based on the latest version of the famous Lens.

 

The source is here: https://github.com/freelensapp/

 

You can also compile it yourself, but for a faster way just download a compiled build: https://github.com/freelensapp/freelens-nightly-builds/releases

 

After that, you can run it:

 

C:\Users\mf\AppData\Local\Programs\freelens\Freelens.exe

 

The main gap is that, out of the box, there is no Node-Pod-Menu extension. But there is another project, https://github.com/freelensapp/freelens-node-pod-menu, for that. By the way, Piotr Roszczynski from Poland is one of the contributors.

You can compile it yourself, but it is faster to just launch Freelens.exe, press Ctrl-Shift-E, and install the extension by entering @freelensapp/freelens-node-pod-menu and pressing Install.

Now you can enjoy the pod menu:

So, two steps to start using the successor of Lens – FreeLens:

  1. Download https://github.com/freelensapp/freelens-nightly-builds/releases.
  2. Add the extension (Ctrl-Shift-E): enter @freelensapp/freelens-node-pod-menu and press Install.

Copilot Studio – Installing and using ready to use agents

Conduct your own lab, download: Contoso Travel Policies

Azure AI – Speech To Text (STT) , Summary – Sample Solution

Repository: https://github.com/MariuszFerdyn/hands-on-lab-azure-functions-flex-openai

Step by step:

# Clone the repository
git clone https://github.com/MariuszFerdyn/hands-on-lab-azure-functions-flex-openai

# Login to Azure :
az login

# Display your account details
az account show

# Select your Azure subscription
az account set --subscription <subscription-id>

# Go to the project directory
cd <cloned-project-path>

# Authenticate using azd
azd auth login

# Create resources using the IaC defined in the infra directory
azd provision

# .azure/ignite.env

# Deploy Functions to Azure
azd env set AZURE_LOCATION eastus2 -e ignite2024mf --no-prompt
azd env refresh -e ignite2024mf

azd deploy

# Post a wav file to STT via the function

# Update AudioTranscriptionOrchestration01.cs

azd deploy processor

# Update AudioTranscriptionOrchestration02.cs

azd deploy processor

# Post a wav file to STT via the function

Tools:

  • https://dotnet.microsoft.com/en-us/download/dotnet/
  • https://code.visualstudio.com/
  • https://learn.microsoft.com/en-us/cli/azure/install-azure-cli-windows?tabs=azure-cli#install-or-update
  • winget install microsoft.azd
  • https://git-scm.com/downloads
  • https://learn.microsoft.com/en-us/powershell/scripting/install/installing-powershell-on-windows?view=powershell-7.4#installing-the-msi-package
  • REST Client – Visual Studio Marketplace

 

Copilot for SharePoint / Copilot for Documents in SharePoint Library

An introduction to Copilot for SharePoint – especially Copilot for Documents in a SharePoint library.

Zero Trust – Step by Step Workshop

The successor of the Zero Trust Model described at https://rzetelnekursy.pl/zero-trust-model-audit-for-free has a new version: https://aka.ms/ztworkshop.

It includes:

  1. Assessment
  2. Workshops
  3. Implementation Roadmap

 

 

Scripts to Report All Backups in Recovery Services Vaults

All VMs in Backup:

# Import necessary modules
# Import-Module Az
# Connect-AzAccount

# Get all Recovery Services Vaults
$recoveryVaults = Get-AzRecoveryServicesVault

# Initialize an array to hold backup items
$backupItems = @()

# Enumerate all Recovery Services Vaults
foreach ($vault in $recoveryVaults) {
    # Set the context to the current vault
    Set-AzRecoveryServicesVaultContext -Vault $vault

    # Get all backup containers in the current vault
    $containers = Get-AzRecoveryServicesBackupContainer -ContainerType AzureVM

    # Enumerate all backup containers
    foreach ($container in $containers) {
        # Get all backup items in the current container for the specified workload type
        $items = Get-AzRecoveryServicesBackupItem -Container $container -WorkloadType AzureVM

        # Add the backup items to the array, renaming the existing ContainerName property
        $backupItems += $items | Select-Object @{Name="VaultName";Expression={$vault.Name}}, @{Name="ResourceGroupName";Expression={$vault.ResourceGroupName}}, @{Name="BackupContainerName";Expression={$container.Name}}, *
    }
}

# Display the backup items in Out-GridView
$backupItems | Out-GridView
$backupItems | Export-CSV VMSpecSources.csv

 

All SQL databases in Backup (including deleted source databases):

# Import necessary modules
# Import-Module Az
# Connect-AzAccount
# Login to Azure account

# Get all Recovery Services Vaults
$recoveryVaults = Get-AzRecoveryServicesVault

# Initialize an array to hold backup items
$backupItems = @()

# Enumerate all Recovery Services Vaults
foreach ($vault in $recoveryVaults) {
    # Set the context to the current vault
    #Set-AzRecoveryServicesVaultContext -Vault $vault

    # Get all backup containers in the current vault for MSSQL
    $containers = Get-AzRecoveryServicesBackupContainer -ContainerType AzureVMAppContainer -VaultId $vault.ID
    #echo "--------- containers --------"
    #echo $containers
    #echo "-----------------------------"

    # Enumerate all backup containers
    foreach ($container in $containers) {
        # Get all backup items in the current container for MSSQL
        Set-AzRecoveryServicesVaultContext -Vault $vault
        #Get-AzRecoveryServicesBackupProtectableItem -ItemType "SQLDataBase"
        $items = Get-AzRecoveryServicesBackupItem -Container $container -WorkloadType MSSQL -VaultId $vault.ID
        Write-Host "--------- items --------"
        Write-Host "Vault    :" -NoNewline; Write-Host $vault.Name
        Write-Host "Container:" -NoNewline; Write-Host $container.Name
        Write-Host "Items    :" -NoNewline; Write-Host $items.FriendlyName
        Write-Host "-----------------------------"

        # Add the backup items to the array, renaming the existing ContainerName property
        $backupItems += $items | Select-Object `
            @{Name="VaultName"; Expression={$vault.Name}}, `
            @{Name="ResourceGroupName"; Expression={$vault.ResourceGroupName}}, `
            @{Name="BackupContainerName"; Expression={$container.Name}}, FriendlyName, ServerName, ParentName, ParentType, LastBackupErrorDetail, ProtectedItemDataSourceId, ProtectedItemHealthStatus, ProtectionStatus, PolicyId, ProtectionState, LastBackupStatus, LastBackupTime, ProtectionPolicyName, ExtendedInfo, DateOfPurge, DeleteState, Name, Id, LatestRecoveryPoint, SourceResourceId, WorkloadType, ContainerName, ContainerType, BackupManagementType
    }
}

# Display the backup items in Out-GridView
$backupItems | Out-GridView

# Export the backup items to a CSV file
$backupItems | Export-CSV -Path “MSSQLBackupItemsAll.csv” -NoTypeInformation

 

All SQL Databases in Backup (databases that still exist):

 

# Import necessary modules
# Import-Module Az
# Connect-AzAccount
# Login to Azure account

# Get all Recovery Services Vaults
$recoveryVaults = Get-AzRecoveryServicesVault

# Initialize an array to hold backup items
$backupItems = @()

# Enumerate all Recovery Services Vaults
foreach ($vault in $recoveryVaults) {
    # Set the context to the current vault
    #Set-AzRecoveryServicesVaultContext -Vault $vault

    # Get all backup containers in the current vault for MSSQL
    $containers = Get-AzRecoveryServicesBackupContainer -ContainerType AzureVMAppContainer -VaultId $vault.ID
    #echo "--------- containers --------"
    #echo $containers
    #echo "-----------------------------"

    # Enumerate all backup containers
    foreach ($container in $containers) {
        # Get all protectable items (databases that still exist) in the current container for MSSQL
        $items = Get-AzRecoveryServicesBackupProtectableItem -Container $container -WorkloadType MSSQL -ItemType "SQLDataBase" -VaultId $vault.ID
        echo "--------- items --------"
        echo ("Vault    : " + $vault.Name)
        echo ("Container: " + $container.Name)
        echo ("Items    : " + $items.FriendlyName)
        echo "-----------------------------"

        # Add the backup items to the array, renaming the existing ContainerName property
        $backupItems += $items | Select-Object @{Name="VaultName";Expression={$vault.Name}}, @{Name="ResourceGroupName";Expression={$vault.ResourceGroupName}}, @{Name="BackupContainerName";Expression={$container.Name}}, `
            FriendlyName, ProtectionState, ProtectableItemType, ParentName, ParentUniqueName, ServerName, `
            IsAutoProtectable, IsAutoProtected, AutoProtectionPolicy, Subinquireditemcount, Subprotectableitemcount, `
            Prebackupvalidation, NodesList, Name, Id, WorkloadType, ContainerName, ContainerType, BackupManagementType
    }
}

# Display the backup items in Out-GridView
$backupItems | Out-GridView

# Export the backup items to a CSV file
$backupItems | Export-CSV -Path “MSSQLBackupItemsExisting.csv” -NoTypeInformation

Combine AppID from Azure Logs with the Application Name – How to

If we query for AppId in Log Analytics, like:

MicrosoftGraphActivityLogs

| summarize NumberOfRequests=count() by AppId

| order by NumberOfRequests desc

we usually need to combine it with the Application name.

So we need to export all Enterprise Applications and App Registrations to CSV from the portal (or with PowerShell – see the sketch below the list):

  • https://portal.azure.com/#view/Microsoft_AAD_IAM/StartboardApplicationsMenuBlade/~/AppAppsPreview/menuId~/null
  • https://portal.azure.com/#view/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/~/RegisteredApps
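
If you prefer PowerShell over the portal export, here is a hedged sketch using the Az.Resources module (verify the property names in your environment) that writes both App Registrations and Enterprise Applications (service principals) into one CSV:

# Export display name and AppId of all applications and service principals
Connect-AzAccount
$apps = Get-AzADApplication      | Select-Object @{Name="ApplicationName";Expression={$_.DisplayName}}, @{Name="AppID";Expression={$_.AppId}}
$sps  = Get-AzADServicePrincipal | Select-Object @{Name="ApplicationName";Expression={$_.DisplayName}}, @{Name="AppID";Expression={$_.AppId}}
@($apps) + @($sps) | Sort-Object AppID -Unique | Export-Csv AppIDList.csv -NoTypeInformation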

Do not forget about Managed Identities; you can list them with this query:

resources

| where type =~ 'Microsoft.ManagedIdentity/userAssignedIdentities'

| project name, principalId = properties.principalId, clientId = properties.clientId

From all of them, create a file AppIDList.csv like:

ApplicationName,AppID
"VeeamM365B",d8dff9d3-367b-4967-8a2a-f2d31c929f5d
"P2P Server",39ed2d41-3e76-4505-ae68-56c02cf713c9

 

We need to upload this file to a storage account so that it is publicly accessible.

And finally, we can make a query that combines AppID with the corresponding name, so we can execute:

let ApplicationInformation = externaldata (ApplicationName: string, AppId: string, Reference: string ) [h"https://xxxx.blob.core.windows.net/xxx-allapplicationslist/xxx.csv"] with (ignoreFirstRecord=true, format="csv");

MicrosoftGraphActivityLogs

| summarize NumberOfRequests=count() by AppId

| lookup kind=leftouter ApplicationInformation on $left.AppId == $right.AppId

| order by NumberOfRequests desc

| project AppId, ApplicationName, NumberOfRequests

So finally we got AppID and the Application Name.

 

Sample Event Driven application – Storage Account, Azure Functions, Computer Vision and CosmosDB

A sample event-driven application that uses a Storage Account as input, then triggers an Azure Function, uses Computer Vision, and stores the information in Cosmos DB – all with the help of Event Grid.

 

Full source code: https://github.com/MariuszFerdyn/Build-and-deploy-serverless-apps-with-Azure-Functions-and-Azure-AI

Azure Form Recognizer / Document Intelligence Studio – Step By Step #MSBUILD

Complete solution for using Azure Form Recognizer / Document Intelligence Studio.

 

Source code used in this Lab:

https://github.com/MariuszFerdyn/AzureAI-Document-Intelligence-Studio—Form-Recognizer

Azure Devops Custom (Errors) Messages in emails and Job reports

One of the common feature requests for Azure DevOps is to have custom messages in emails. It would be a great feature, but currently we have a few options – so let's see them one by one.

The code:

- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      echo "##vso[task.logissue type=error]Hello world!"
      echo "##vso[task.complete result=Failed]"

produces the following report (we see the message), and email (we see the message):

The code:

- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      echo "##vso[task.logissue type=warning]Hello world!"
      echo "##vso[task.complete result=Succeeded]"

produces the following report (we see the message), and email (we do not see the message):

The code:

- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      echo "##vso[task.logissue type=error]01Beginning of a group…Warning message…Error messaage…Start of a section…Debug text…Command-line being run!"
      echo "##vso[task.logissue type=error]02Beginning of a group…Warning message…Error messaage…Start of a section…Debug text…Command-line being run!"
      echo "##vso[task.logissue type=error]03Beginning of a group…Warning message…Error messaage…Start of a section…Debug text…Command-line being run!"
      echo "##vso[task.logissue type=error]04Beginning of a group…Warning message…Error messaage…Start of a section…Debug text…Command-line being run!"
      echo "##vso[task.logissue type=error]05Beginning of a group…Warning message…Error messaage…Start of a section…Debug text…Command-line being run!"
      echo "##vso[task.logissue type=error]06Beginning of a group…Warning message…Error messaage…Start of a section…Debug text…Command-line being run!"
      echo "##vso[task.logissue type=error]07Beginning of a group…Warning message…Error messaage…Start of a section…Debug text…Command-line being run!"
      echo "##vso[task.complete result=Succeeded]"

produces the following report (we see the message), and email (we see the message):

Build your own Copilots with Microsoft Copilot Studio – Step By Step LAB from #MSBUILD 2024

Learn how you can build your own copilots with Microsoft Copilot Studio. In this workshop you’ll learn how Copilots can be created for use across the business. You’ll also see how you can create custom plug ins that can integrate with custom solutions. We’ll then show you how you can use Generative AI for even more intelligent responses.

The source code used in example:

https://github.com/MariuszFerdyn/Build-your-own-Copilots-with-Microsoft-Copilot-Studio

Intune Script that can create new local admin user – helpful if LAPS fails

Remediation script:

# Define the new user's username and password
$newUsername = "mfmfmf"
$newPassword = ConvertTo-SecureString "xxxx" -AsPlainText -Force

# Create the new local user
New-LocalUser -Name $newUsername -Password $newPassword -FullName "New User" -Description "This is a new user account."

# Optionally, add the user to a group (e.g., Administrators)
Add-LocalGroupMember -Group "Administrators" -Member $newUsername

# Output a success message
Write-Output "User $newUsername has been created successfully."
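
Intune remediations also need a detection script; a minimal hedged sketch that reports non-compliance when the account is missing could look like this:

# Detection script – exit 1 (non-compliant, run remediation) when the user does not exist
if (Get-LocalUser -Name "mfmfmf" -ErrorAction SilentlyContinue) { exit 0 } else { exit 1 }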

Via Intune fix – The trust relationship between this workstation and the primary domain failed

On the machine where you see that the trust relationship is broken, log in using the last (cached) credentials, but with the network disconnected – that way it should be possible. We save the credentials locally to avoid storing AD credentials in Intune.

  • Save the variables for the account with permission to reset the computer password locally, using this script:

$adminUsername="xxxx\adjoinuser"
$adminPassword="xxx"
#$cred = New-Object PSCredential $adminUsername, ($adminPassword | ConvertTo-SecureString -AsPlainText -Force)
New-Item -ItemType Directory c:\aaaa
Get-Variable admin* | Export-Clixml c:\aaaa\vars.xml
#Import-Clixml c:\aaaa\vars.xml | %{ Set-Variable $_.Name $_.Value }

  • Create Intune script:

    • Detection script:
    • exit 1

    • Remediation script:
    • Import-Clixml c:\aaaa\vars.xml | %{ Set-Variable $_.Name $_.Value }
      #$adminUsername
      #$adminPassword
      $cred = New-Object PSCredential $adminUsername, ($adminPassword | ConvertTo-SecureString -AsPlainText -Force)
      Test-ComputerSecureChannel -Repair -Credential $cred

 

  • Assign it to a newly created group with the machine name and/or the affected username.
  • After executing the script via Intune, the trust relationship should be fixed.

 

You can also do it this way – everything in the Intune script. The password will be stored in Intune, but no access to the affected machine is needed:

  • Remediation script:

 

$adminUsername="xxxx\adjoinuser"
$adminPassword="xxx"
$cred = New-Object PSCredential $adminUsername, ($adminPassword | ConvertTo-SecureString -AsPlainText -Force)
Test-ComputerSecureChannel -Repair -Credential $cred

 

Azure DevOps Export Variables Group – GUI

As you have probably noticed, there is no GUI to export Variable Groups, but there is a very nice API that can be called directly from your browser. So simply call:

https://dev.azure.com/{organization}/{project}/_apis/distributedtask/variablegroups?api-version=5.0-preview.1

like:

https://dev.azure.com/xxx/AppGeatewyBicep/_apis/distributedtask/variablegroups?api-version=5.0-preview.1

It displays all the Variable Groups in Azure DevOps, like:


To display one Variable Group and export it, use this:

https://dev.azure.com/{organization}/{project}/_apis/distributedtask/variablegroups/{groupId}?api-version=5.0-preview.1

like

https://dev.azure.com/xxx/AppGeatewyBicep/_apis/distributedtask/variablegroups/4?api-version=5.0-preview.1
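
If you want the same data from a script instead of the browser, here is a hedged PowerShell sketch using a Personal Access Token (organization, project, and PAT are placeholders):

# Call the Variable Groups REST API with Basic auth (":PAT" encoded as Base64) and save the result
$pat   = "<your-PAT>"
$token = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
$uri   = "https://dev.azure.com/{organization}/{project}/_apis/distributedtask/variablegroups?api-version=5.0-preview.1"
Invoke-RestMethod -Uri $uri -Headers @{ Authorization = "Basic $token" } | ConvertTo-Json -Depth 10 | Out-File variablegroups.json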

 

More info here.

NSG Flow Logs / VNet Flow Logs

VNet Flow Logs is the successor of NSG Flow Logs; it works not in the NSG context but inside VNets, which gives us a better view. If you put the consolidated logs into a Log Analytics workspace, there are some additional advantages:

 

NSG Flow Logs go to the AzureNetworkAnalytics_CL table, which cannot be exported, so they cannot be part of an Event Hub solution.

VNet Flow Logs go to the NTANetAnalytics table, and this table can be exported to an Event Hub solution.

 

Build a Music Recommendation System with Azure Container Apps and AI – Lab from MS Build 2024

We’re building a music recommendation service where users will be able to search and select from a set of songs, and the system will recommend similar songs to them. Below is a depiction of the architecture:

The application is composed of four different components:

  • An Azure Container Apps (ACA) Jupyter Environment which teaches about and produces embeddings for our library of 11,000 songs.
  • A Qdrant ACA Add-On Vector DB which stores embeddings (think of them as fingerprints) and produces our recommendations based on them.
  • An ACA API app which brokers the data between the frontend UI and the vector database.
  • An ACA Frontend app which provides the user-facing UI to interact with the recommendation service.

The overall intention of this application is for the user to learn about vector databases. Hence the process of deploying this application is broken up into two parts.

In part one we play the role of a data scientist or ML engineer. We will familiarize ourselves with the process of generating embeddings for our song data. This part completes when we’ve stored our embeddings in our vector database.

In part two we play the role of an application engineer and turn the stored embeddings data into a recommendation service by adding an API and a frontend.

Step by Step Deployment:

az login

az provider register -n Microsoft.OperationalInsights --wait &&
az provider register -n Microsoft.ServiceLinker --wait &&
az provider register -n Microsoft.App --wait

export LOCATION=westus2
export RG=music-rec-service
export ACA_ENV=music-env
export NOTEBOOK_IMAGE=mafamafa/aca-music-recommendation-notebook
export BACKEND_IMAGE=mafamafa/aca-music-recommendation-backend
export FRONTEND_IMAGE=mafamafa/aca-music-recommendation-frontend

# create the resource group
az group create -l $LOCATION --name $RG

az containerapp env create --name $ACA_ENV --resource-group $RG --location $LOCATION --enable-workload-profiles

## Create the vector db add-on
az containerapp add-on qdrant create --environment $ACA_ENV --resource-group $RG --name qdrant

# add a workload profile for the large Jupyter image
az containerapp env workload-profile add --name $ACA_ENV --resource-group $RG --workload-profile-type D8 --workload-profile-name bigProfile --min-nodes 1 --max-nodes 1

az containerapp create --name music-jupyter --resource-group $RG --environment $ACA_ENV --image $NOTEBOOK_IMAGE --cpu 4 --memory 16.0Gi --workload-profile-name bigProfile --min-replicas 1 --max-replicas 1 --target-port 8888 --ingress external --bind qdrant

az containerapp logs show -g $RG -n music-jupyter | grep token

####Open in Portal music-jupyter url and put Token

####Start.ipynb

####Import.ipynb

# launch the backend application
az containerapp create --name music-backend --resource-group $RG --environment $ACA_ENV --image $BACKEND_IMAGE --cpu 4 --memory 8.0Gi --workload-profile-name bigProfile --min-replicas 1 --max-replicas 1 --target-port 8000 --ingress external --bind qdrant

####http://<YOUR_ACA_ASSIGNED_DOMAIN>/songs

az containerapp create --name music-frontend --resource-group $RG --environment $ACA_ENV --image $FRONTEND_IMAGE --cpu 2 --memory 4.0Gi --min-replicas 1 --max-replicas 1 --ingress external --target-port 8080 --env-vars UI_BACKEND=https://music-backend.<YOUR_UNIQUE_ID>.westus2.azurecontainerapps.io

 

GPU:

# create the environment first
az containerapp env create --name $ACA_ENV --resource-group $RG --location $LOCATION --enable-workload-profiles --enable-dedicated-gpu

az containerapp create --name music-jupyter --resource-group $RG --environment $ACA_ENV --image mafamafa/aca-music-recommendation-notebook:gpu --cpu 24 --memory 48.0Gi --workload-profile-name gpu --min-replicas 1 --max-replicas 1 --target-port 8888 --ingress external --bind qdrant

Copilot in Word – Real Example

Microsoft Build 2024 – Sorted News not connected with AI/Copilot and Connected with AI/Copilot

Complete Microsoft Build 2024 Book of News is here.

According to the document: Here is a list of features not connected with Copilot or AI:

  • Khan Academy and Microsoft Announce Partnership
  • Speech Analytics, Video Dubbing in Preview in Azure AI Speech
  • Introducing Real-Time Intelligence in Microsoft Fabric
  • New Capabilities and Updates in Microsoft Fabric
  • New Capabilities in Azure Cosmos DB
  • Snowflake Apache Iceberg Shortcuts in Fabric
  • Azure Compute Fleet Now in Preview
  • Azure Migrate and Azure Container Storage Updates
  • New Azure Virtual Machine Series
  • Azure App Service Boosts Performance and Security for Web App Creation
  • Azure Container Apps Launches Dynamic Sessions
  • Azure Functions Launches Flex Consumption Plan, Extensions
  • Azure Kubernetes Service Automatic Makes Kubernetes Adoption Easy
  • New Azure Event Grid Capabilities Support IoT Solutions, Event Sources
  • New Enhancements and Integrations in Azure Load Testing
  • Spring Batch Support for Azure Spring Apps Enterprise in Preview
  • Updates to Azure Logic Apps
  • Visual Studio Code for Education Now Generally Available
  • Microsoft Edge for Business Boosts Defenses Against Data Leaks, Vulnerabilities
  • Real-Time Video Translation in Microsoft Edge Coming Soon
  • Fluid Framework 2.0 Now in Preview
  • New Enhancements for Custom App Experiences Connected to Microsoft Teams

According to the document: Here is a list of AI and Copilot features:

  • Azure Patterns and Practices for Private Chatbots
  • Custom Generative Mode
  • Azure AI Search Features Search Relevance Updates and New Integrations
  • Azure AI Studio Lets Developers Responsibly Build and Deploy Custom Copilots
  • Azure OpenAI Service Features Key AI Advancements
  • Khan Academy and Microsoft Announce Partnership
  • Microsoft Adds Multimodal Phi-3 Model Phi-3-Vision
  • Safeguard Copilots with New Azure AI Content Safety Capabilities
  • Speech Analytics, Video Dubbing in Preview in Azure AI Speech
  • Introducing Real-Time Intelligence in Microsoft Fabric
  • New AI Capabilities in Azure Database for PostgreSQL
  • New Capabilities and Updates in Microsoft Fabric
  • New Capabilities in Azure Cosmos DB
  • Microsoft for Startups Founders Hub Gains AI Capabilities
  • New AI Features for Microsoft Learn Now Available
  • Azure API Center and Generative AI Capabilities in Azure API Management Now Available
  • Azure App Service Boosts Performance and Security for Web App Creation
  • Azure Container Apps Launches Dynamic Sessions
  • Azure Functions Launches Flex Consumption Plan, Extensions
  • Azure Kubernetes Service Automatic Makes Kubernetes Adoption Easy
  • Azure Service Bus Updates Now in Preview
  • Azure Static Web Apps Features Dedicated Pricing Plan
  • Dev Box Adds Ready-to-Code, Enterprise Management Features
  • Expanding Extensibility Model to Pulumi in Azure Deployment Environments
  • Introducing GitHub Copilot Extensions, Featuring GitHub Copilot for Azure
  • New Azure Event Grid Capabilities Support IoT Solutions, Event Sources
  • New Enhancements and Integrations in Azure Load Testing
  • Spring Batch Support for Azure Spring Apps Enterprise in Preview
  • Updates to Azure Logic Apps
  • Visual Studio 17.10 Now Integrates GitHub Copilot
  • Visual Studio Code for Education Now Generally Available
  • Microsoft Copilot Capabilities in Azure
  • Microsoft Copilot in Azure Preview Open to All Customers
  • Copilot Studio Powering Next Wave of Copilot Experiences
  • Power Automate Updates Feature AI and Process Automation
  • AI Extensibility for Mesh in Preview
  • Fluid Framework 2.0 Now in Preview
  • New AI-Powered Features and Enhanced Data Protection in Microsoft Teams Premium
  • New Enhancements for Custom App Experiences Connected to Microsoft Teams
  • New Features in Microsoft Teams and Loop Help Teams Collaborate More Effectively
  • Microsoft Edge for Business Boosts Defenses Against Data Leaks, Vulnerabilities
  • Real-Time Video Translation in Microsoft Edge Coming Soon
  • Azure Patterns and Practices for Private Chatbots
  • Azure AI Search Features Search Relevance Updates and New Integrations
  • Azure AI Studio Lets Developers Responsibly Build and Deploy Custom Copilots
  • Khan Academy and Microsoft Announce Partnership
  • Azure AI Content Safety Capabilities
  • Speech Analytics, Video Dubbing in Preview in Azure AI Speech
  • New AI Capabilities in Azure Database for PostgreSQL
  • New Azure Virtual Machine Series Optimized for AI and Cloud-Native Workloads
  • Microsoft for Startups Founders Hub Gains AI Capabilities
  • New AI Features for Microsoft Learn Now Available
  • Azure API Center and Generative AI Capabilities in Azure API Management Now Available
  • Visual Studio 17.10 Now Integrates GitHub Copilot
  • Real-Time Video Translation in Microsoft Edge Coming Soon
  • AI Extensibility for Mesh in Preview
  • New AI-Powered Features and Enhanced Data Protection in Microsoft Teams Premium

 

Both lists were generated by Copilot… so Copilot/AI really is everywhere!

 

Microsoft projects that are in fact Mark Russinovich's top of mind – you must use them

Mark Russinovich was the founder of Sysinternals, the company behind tools like PsExec, Sysmon, and other tools that companies bought and used for debugging Windows. Around the year 2000 almost every enterprise used them. Nowadays he is Microsoft Azure's CTO (Chief Technology Officer).

Today’s Mark Russinovich’s top of mind projects are:

  • KEDA (https://keda.sh/) – an event-driven auto-scaler for Kubernetes; for example, it allows running Azure Functions in an Azure Kubernetes cluster.
  • Dapr (https://dapr.io/) – integrated APIs for the underlying resources. For instance, when you're using the Dapr publish-subscribe API, you can change the message broker by swapping out a YAML component file to switch from RabbitMQ to Kafka (or any other supported broker), without changing your application code.
  • Copa (https://project-copacetic.github.io/copacetic/website/) – patching container images; an open-source image vulnerability patching tool designed for the security of container images.
  • Radius (https://github.com/radius-project/radius) – supports deploying applications across private cloud, Microsoft Azure, and Amazon Web Services.

Windows Server 2025 What’s New in Active Directory

Windows Server 2025 What’s New in Active Directory and not only:

  • RTM in April 2024 (Microsoft Build Conference?)
  • Domain/Forest level 2025
  • 32K instead of 8K page size (speed)
  • NUMA Support
  • Replication Priority
  • Kerberos AES-SHA2
  • Deprecating NTLM
  • IAKerb (Local KDC)
  • DC Locator Improvements
  • LDAP improvements e.g. confidential attributes
  • SMB auth limiter
  • SMB Signing required
  • SMB over QUIC
  • SMB alternate ports
  • SMB mandate encryption

 

See all of them on YouTube (let's use Copilot to do a recap):

  • http://aka.ms/ADTT32kPagesDemo
  • http://aka.ms/ADTTNumaDemo
  • http://aka.ms/ADTTDCLocatorDemo
  • http://aka.ms/ADTTDmsaDemo
  • http://aka.ms/ADTTLsalookupDemo
  • http://aka.ms/ADTTDCLocPerfDemo
  • http://aka.ms/ADTTLdapPerfDemo
  • https://techcommunity.microsoft.com/t5/storage-at-microsoft/smb-signing-required-by-default-in-windows-insider/ba-p/3831704
  • https://techcommunity.microsoft.com/t5/storage-at-microsoft/smb-alternative-ports-now-supported-in-windows-insider/ba-p/3974509

 

Windows Server 2025 – try it now:

https://www.microsoft.com/en-us/software-download/windowsinsiderpreviewserver

Defender for Endpoint / Defender for Servers – Linux – Real-time protection is turned off by default – check if your Linux workloads are safe #1

It was a total surprise for me… without any warning… just info from: https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/linux-whatsnew?view=o365-worldwide

July-2023 Build: 101.23062.0010 | Release version: 30.123062.0010.0

 

  • Other fixes and improvements
    • From this version, enforcementLevel are in passive mode by default giving admins more control over where they want ‘RTP on’ within their estate
    • This change only applies to fresh MDE deployments, for example, servers where Defender for Endpoint is being deployed for the first time. In update scenarios, servers that have Defender for Endpoint deployed with RTP ON, continue operating with RTP ON even post update to version 101.23062.0010

 

 

July-2023 Build: 101.23062.0010 | Release version: 30.123062.0010.0

 

Available in Defender for Endpoint version 101.10.72 or higher. Default is changed from real_time to passive for Endpoint version 101.23062.0001 or higher.

 

  • Passive (passive): Runs the antivirus engine in passive mode. In this:
    • Real-time protection is turned off: Threats are not remediated by Microsoft Defender Antivirus.
    • On-demand scanning is turned on: Still use the scan capabilities on the endpoint.
    • Automatic threat remediation is turned off: No files will be moved and security admin is expected to take required action.
    • Security intelligence updates are turned on: Alerts will be available on security admins tenant.

Interesting fix and improvement… it means that eventually an attacker will not be blocked… So if you deployed MDE after July, please check your settings. The good question is how… Microsoft talks about Ansible, Puppet, and Chef for managing Defender.

 

Another option can be Azure Run Command (via Azure DevOps / Azure Automation / Azure Function):

$vm="$(VM)"
write-host $vm
Invoke-AzVmRunCommand -ResourceGroupName "$(ResourceGroupName)" -VMName $vm -CommandId "RunPowerShellScript" -ScriptPath "$(System.DefaultWorkingDirectory)\_project\scripts\xxx.ps1"

Or, in the future, manage it at scale by using Azure Policy Guest Configuration.

https://cloudbrothers.info/en/azure-persistence-azure-policy-guest-configuration/

Unfortunately, I was not able to find a ready-to-use policy, so we need to write our own (How to install the machine configuration authoring module – Azure Automanage | Microsoft Learn), which may not be that easy, but stay tuned…

 
