Brand Kits in Microsoft 365 Copilot App

🔹 Feature: Brand Kits in Microsoft 365 Copilot App  

🔹 What It Does: Provides one centralized place to manage all your brand assets—logos, color palettes, custom fonts, templates, styles, brand guidelines, and voice—ensuring consistency across your organization.

What It Gives You:

✅ Consistent Branding: Keep every document, slide deck, and communication aligned with your organization’s branding effortlessly.  

✅ Centralized Asset Management: Manage logos, templates, and other brand components in one secure location.  

✅ Faster Content Creation: Copilot automatically applies your brand kits when creating materials in Word, PowerPoint, Outlook, and more.  

✅ Controlled Access: Define exactly who can view, edit, or manage brand kits to maintain quality and compliance.  

✅ Scalable for Any Team: Designed for enterprises, internal teams, and personal use to maintain stylistic alignment.

Types of Brand Kits

  • Official: Managed by brand managers; accessible to everyone in the tenant.  
  • Shared: Available to a select group of people; access is controlled at the time of sharing.  
  • Personal: Available only to the creator of the brand kit.

🌐 https://support.microsoft.com/en-us/topic/create-and-manage-official-brand-kits-in-microsoft-365-copilot-app-6bc8a5a7-5697-466b-9e1f-302a38d44afc

Security Copilot is Now Auto-Provisioned for Microsoft 365 E5 – but only Security Copilot

🔹 Feature: Security Copilot is Now Auto-Provisioned for Microsoft 365 E5

🔹 What It Does: The news from Ignite is that Copilot is now included in the E5 license, but this covers Security Copilot only, not the productivity Copilot for apps like Word, Excel, or Copilot Studio.

What It Gives You:

✅ Security operations in Microsoft Defender: Simplify phishing triage and incident response with agents that classify alerts, summarize threats, and guide remediation.

✅ Data security in Microsoft Purview: Strengthen data protection and compliance with agents that assist in triaging alerts and discovering sensitive data.

✅ Identity and access in Microsoft Entra: Enhance access controls and reduce risk using agents that optimize Conditional Access policies and automate access reviews.

✅ Endpoint management in Microsoft Intune: Maintain secure, compliant endpoints more efficiently with agents that help assess changes before they impact productivity.

✅ Custom Use Cases: Build custom Security Copilot agents tailored to your organization’s unique workflows for maximum flexibility and efficiency.

More info:

🌐 https://learn.microsoft.com/en-us/copilot/security/security-copilot-inclusion

Azure Managed Lustre

🔹 Feature: Azure Managed Lustre

🔹 What It Does: A high-performance distributed parallel file system designed for large-scale computing environments, capable of handling petabytes with high throughput.

What It Gives You:

✅ Managed PaaS Experience: Enjoy a fully managed platform-as-a-service (PaaS) solution for hassle-free deployment and management of Lustre.

✅ Scalable Throughput Options: Choose from various throughput levels at different price points to match diverse workloads while maintaining top performance.

✅ Region Flexibility: Deploy across multiple Azure regions with compliance and data residency options tailored to your needs.

✅ Scalability: Scale effortlessly to hundreds of petabytes (PB), leveraging durability, availability, and cost benefits with Azure Blob storage integration.

✅ Integration with Azure Services: Seamlessly connects with Azure HPC services, Azure Machine Learning, and Azure Kubernetes Service for a unified HPC and AI ecosystem.

🌐 https://learn.microsoft.com/en-us/azure/azure-managed-lustre/amlfs-overview

 

Sensitivity Labels in OneNote

🔹 Feature: Sensitivity Labels in OneNote

🔹 What It Does: OneNote files are now protected and governed just like in other Office apps, including encryption, ensuring your notes stay secure.

 

What is it giving you:

 

✅ Enhanced Security: Apply sensitivity labels directly in OneNote to control access and protect data.

✅ Consistent Governance: Enjoy the same compliance and governance tools available in other Office apps.

✅ Data Encryption: Keep your notes encrypted for added security, both at rest and in transit.

✅ Ease of Use: Seamless integration with existing Microsoft 365 security policies.

 

🌐 https://www.microsoft.com/insidetrack/blog/confidential-by-design-how-were-securing-onenote-for-the-age-of-ai-at-microsoft/

 

#MSIgnite 2025 – Latest News in Azure Compute

🔹 Feature: The Latest News in Azure Compute

Fresh updates from #MSIgnite, announced at the conference or just before! 🚀

✅ Azure Boost – Up to 20 GB/s Remote Storage Throughput & 1M IOPS

Azure Boost enhances performance by offloading virtualization tasks like networking, storage, and host management to specialized hardware/software. This boosts CPU efficiency for workloads, delivering up to 20 GB/s remote storage throughput and 1M IOPS for remote storage scenarios.

🌐 https://learn.microsoft.com/en-us/azure/azure-boost/overview

✅ Das / Eas / Fasv7 VMs – 5th Gen AMD EPYC™ (Turin / EPYC 9005)

Targeting varied needs: Dasv7 (general purpose), Easv7 (memory-heavy workloads), and Fasv7 (CPU-heavy workloads). Built on AMD EPYC 9005 (“Turin”), these VMs are ideal for diverse enterprise demands.

✅ Ds / Esv7 VMs – Intel® Xeon® 6 (Granite Rapids)

Introducing Dsv7 (general purpose) and Esv7 (memory-optimized) VMs in preview, powered by Intel Xeon 6. Configurations include options with/without local NVMe temp disks and diverse memory-to-vCPU ratios.

✅ Dnsv6 / Ensv6 VMs – Network Optimized

Engineered for network-heavy workloads, these VMs utilize hardware acceleration for enhanced networking performance and faster connection setups, ideal for appliances and high connection-rate services.

✅ Ebsv6 VMs – Remote Storage Optimized

Designed for memory-intensive workloads with demanding remote disk performance. Features include up to 800,000 IOPS and 14,000 MBps remote disk throughput—perfect for databases and analytics.

✅ Confidential VMs – 5th Gen Intel® Xeon® with Intel® TDX

Focused on securing data in use through hardware-enforced isolation. Explore next-gen Intel TDX confidential VMs and DCesv6 specs for heightened security.

✅ New Ultra Disk Capabilities

Azure Ultra Disk advances with features like Instant Access Snapshot, enhancing the snapshot/restore experience for mission-critical, latency-sensitive I/O operations.

🌐 https://azure.microsoft.com/en-us/blog/the-new-era-of-azure-ultra-disk-experience-the-next-generation-of-mission-critical-block-storage/

 

#mvpbuzz #azurenews #Microsoft365 #mctbuzz #msignite

Azure Front Door CAPTCHA

🔹 Feature: Azure Front Door CAPTCHA

🔹 What It Does: Azure Front Door CAPTCHA adds an extra layer of security to your web applications by validating users through interactive challenges, helping prevent automated attacks and bots.

What is it giving you:

✅ Enhances security with CAPTCHA challenges to block malicious bots

✅ Reduces risks of automated attacks like credential stuffing and DDoS

✅ Seamless integration with Azure Front Door’s Web Application Firewall (WAF)

✅ Customizable challenge settings to fit specific security needs

✅ Improved user experience with minimal disruption for legitimate users

More info:

🌐 https://techcommunity.microsoft.com/blog/azurenetworksecurityblog/securing-web-applications-with-azure-front-door-waf-captcha/4416502

 

Improve Azure App Service Performance by Disabling Application Insights Snapshot Debugger and Profiler

When you enable Application Insights on your Azure App Service, you gain powerful monitoring capabilities. However, two additional diagnostic tools—Snapshot Debugger and Profiler—can silently impact your application’s performance if left enabled without careful consideration.

 

  • Both Snapshot Debugger and Profiler attach extra processes/components (e.g., Snapshot Uploader, profiler agent) to your App Service worker process to monitor exceptions and collect traces.

  • When exceptions occur, Snapshot Debugger must analyze them, decide whether to capture, create a minidump, compress it, and upload it, which consumes CPU, memory, and disk I/O for tens to hundreds of seconds per snapshot.

  • Continuous or sampling-based profiling of live requests always adds some overhead; independent testing shows increased CPU usage and response times (for example, roughly 5–20% relative increase in CPU and latency when profiler sampling is active).

  • Depending on the number of errors or the amount of slowness, these tools can become more aggressive. The impact also depends on the SKU: if the machine is already at, say, 40% CPU, you can sometimes see an additional 40% CPU usage from the Snapshot Debugger while it collects traces.

Reportedly, the best method is to disable the Debugger and Profiler using the Azure Portal. Using the CLI, you can invoke:

# Disable Profiler and Code Optimizations
az webapp config appsettings set --resource-group $(ResourceGroupName) --name $(AppService2Name) --settings APPINSIGHTS_PROFILERFEATURE_VERSION=disabled
# Disable Snapshot Debugger
az webapp config appsettings set --resource-group $(ResourceGroupName) --name $(AppService2Name) --settings APPINSIGHTS_SNAPSHOTFEATURE_VERSION=disabled
# Disable Diagnostic Extension
az webapp config appsettings set --resource-group $(ResourceGroupName) --name $(AppService2Name) --settings DiagnosticServices_EXTENSION_VERSION=disabled

 

 

Frontier Program – Onboard the Latest AI Innovations First

🔹 Feature: Frontier Program

🔹 What It Does: The Frontier Program bridges the gap between customers and Microsoft’s latest AI innovations. It’s designed to offer direct engagement with cutting-edge technology, empowering users to collaborate in shaping the future of AI.

 

✅ Hands-On Experience: Customers with Microsoft 365 Copilot licenses, enabled by their IT Admins, can dive into breakthrough features.

✅ Direct Feedback Loop: Share valuable insights directly with Microsoft’s product teams.

✅ Co-Creation Opportunity: Be a part of developing and refining AI technologies, not just adopting them.

✅ Early Access: Get ahead with firsthand experience of emerging AI capabilities.

 

More info:

🌐 https://adoption.microsoft.com/en-us/copilot/frontier-program/

 

Dev Tunnels

🔹 Feature: Dev Tunnels

🔹 What It Does: Dev Tunnels make it simple to securely expose local services, such as HTTP, RDP, and others, to the internet for testing and development, without needing to deploy to a remote server.

 

What is it giving you:

 

✅ Secure Connections: Authorization during Dev Tunnel user login ensures that only you can connect to your tunnel.

✅ Flexible Access: Use --allow-anonymous to permit public access when needed.

✅ Policy Control: Group policies help prevent unauthorized tunnel creation within your organization.

✅ Versatile Protocol Support: Expose not just HTTP, but also local SSH, RDP, or any other local port. The Dev Tunnels client is required on the connecting side.

✅ Cross-Platform Compatibility: Available on Linux, Windows, and MacOS for broad development support.

 

🌐 https://learn.microsoft.com/en-us/azure/developer/dev-tunnels/overview
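To make the idea concrete: a dev tunnel publishes whatever is already listening on a local port. The sketch below spins up a throwaway local HTTP service of the kind you would then expose with the Dev Tunnels CLI (e.g. `devtunnel host -p <port>`); the handler and its response text are made up for illustration.

```python
import http.server
import threading
import urllib.request

# A throwaway local service -- the kind of thing a dev tunnel would expose.
class Hello(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello from my machine"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Keep the console quiet for this demo.
        pass

# Port 0 asks the OS for any free port; a real tunnel would target a fixed one.
server = http.server.HTTPServer(("127.0.0.1", 0), Hello)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/"
reply = urllib.request.urlopen(url).read()
print(reply)  # b'hello from my machine'
server.shutdown()
```

Once a service like this is running, hosting the tunnel is a separate step done with the `devtunnel` CLI, and access is governed by the login/anonymous settings described above.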

Azure VMware Solution Interactive Demos

🔹 Feature: Azure VMware Solution Interactive Demos

🔹 What It Does: Azure VMware Solution Interactive Demos provide hands-on experiences that showcase how to efficiently run VMware workloads natively on Azure.

 

What It Gives You:

✅ Deployment and Day-2 Operations

✅ HCX Migration

✅ Modernize workloads with Azure services

✅ Capabilities of Arc-enabled AVS

✅ Secure outbound internet connectivity

✅ Expand storage with external storage (Azure NetApp Files)

✅ AVS + Azure OpenAI Service

✅ VMware Site Recovery Manager (SRM)

✅ Expand storage with Elastic SAN

 

More info:

🌐 https://aka.ms/AVSDemos

Microsoft 365 Backup

🔹 Feature: Microsoft 365 Backup

🔹 What It Does: Safeguards your Microsoft 365 data, including Exchange, OneDrive, and SharePoint.

 

What It Gives You:

✅ Comprehensive Protection: Ensures all your critical data is securely backed up.

✅ Granular Restore Options: Restore specific files or entire datasets with precision.

✅ Policy-Based Management: Simplify backup configurations.

✅ Seamless Integration: Works smoothly within your Microsoft 365 environment, with no third-party solutions required.

✅ DORA/NIST: Comply with regulatory requirements.

 

🌐 https://learn.microsoft.com/en-us/microsoft-365/backup/?view=o365-worldwide

SQL Managed Instance Next-Gen

🔹 Feature: SQL Managed Instance Next-Gen

🔹 What It Does: Enhances performance, scalability, and cost-efficiency for your SQL workloads in Azure.

What is it giving you:

✅ Improved compute and storage scalability for demanding workloads.

✅ Enhanced security with built-in advanced features.

✅ Better cost management with optimized resource utilization.

✅ Seamless integration with Azure ecosystem for smoother operations.

🌐 More info: https://techcommunity.microsoft.com/blog/azuresqlblog/introducing-azure-sql-managed-instance-next-gen-gp/4092647

Azure VM Applications

🔹 Feature: VM Applications
🔹 What It Does: Simplifies the deployment, management, and updating of applications across your virtual machines, in Azure and beyond.

What is it giving you:

✅ Effortless Software Distribution – Seamlessly deploy apps at scale across multiple VMs.
✅ Supports Azure Arc – Manage applications on both Azure and on-premises environments with Azure Arc integration.
✅ Streamlined Application Management – Simplify version control and application lifecycle management.
✅ Consistent Deployment – Ensure uniform application deployment across your infrastructure.

🌐 More info: https://learn.microsoft.com/en-us/azure/virtual-machines/vm-applications

 

Microsoft Learn AI Skills Navigator – A New Way to Learn

🔹 Feature: Microsoft Learn AI Skills Navigator – A New Way to Learn

Exciting news from #MSIgnite, announced at the conference or just before! 🚀

What is it giving you:

✅ AI-generated podcasts – just ask for a 5-minute podcast to learn while driving to work.

✅ Choose from diverse formats that fit your learning style—videos, labs, guides, and more.

✅ Instantly turn training into podcasts or summaries for faster, more flexible, and inclusive learning.

✅ Learn your way, on your time, with expert-led training combined with real-time AI support.

✅ Personalized AI-generated multimodal learning plans crafted to meet your unique needs.

✅ Skilling sessions designed to deepen your understanding and practical application.

✅ Live-like training available anytime with agentic support for a real-time learning experience.

✅ Instantly convert complex material into flexible learning modules, summaries, or podcasts for quick absorption.

More info:

🌐 https://aiskillsnavigator.microsoft.com

Python in Excel

🔹 Feature: Python in Excel

🌐 Video: https://youtu.be/Egos00xPEF4

🔹 What It Does: Seamlessly integrates Python capabilities directly into Excel, allowing advanced data analysis, visualization, and automation within familiar Excel workflows.

What is it giving you:

✅ Works in the Cloud: No need to install Python locally.

✅ Basic licenses are enough: Office E3 or Business licenses are OK.

✅ Safe: Python in Excel comes with a standard set of Python libraries provided by Anaconda through a secure distribution.

✅ Advanced Data Analysis: Run complex calculations and statistical models using Python libraries like pandas and NumPy directly in Excel.

✅ Powerful Visualizations: Create dynamic charts and graphs with libraries like Matplotlib and Seaborn for enhanced data storytelling.

✅ Automation Efficiency: Automate repetitive tasks, data manipulation, and reporting effortlessly within your spreadsheets.

✅ Seamless Integration: No need for external tools—work with Python and Excel side by side in the same environment.

✅ Enhanced Productivity: Boost your data handling capabilities without leaving the Excel interface you already know.

🌐 For more info: https://support.microsoft.com/en-us/office/get-started-with-python-in-excel-a33fbcbe-065b-41d3-82cf-23d05397f53d
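To give a flavor of what a =PY() cell can do, here is the kind of pandas snippet you might run there. The sample sales data is made up for illustration; in Excel you would pull the data from a range with the `xl()` helper rather than building the DataFrame inline.

```python
import pandas as pd

# Hypothetical sales data; in a =PY() cell this would come from a range, e.g. xl("A1:B7")
df = pd.DataFrame({
    "region": ["North", "South", "North", "South", "North", "South"],
    "sales":  [120, 80, 150, 95, 130, 110],
})

# Aggregate per region, exactly the kind of analysis Python in Excel is built for
summary = df.groupby("region")["sales"].agg(["sum", "mean"])
print(summary)
```

In Excel the resulting table spills back into the grid, so the output can feed regular formulas and charts like any other range.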

MS SQL – Query Hint Recommendation Tool

🔹 Feature: Query Hint Recommendation Tool

🔹 What It Does: An extension to SQL Server Management Studio to optimize your queries efficiently.

What is it giving you:

✅ Establishes Baseline Performance

Ensure your queries are measured against consistent performance metrics.

✅ Identifies Helpful Hints Based on Actual Elapsed Time

Focus on real execution time metrics, not just optimizer estimated costs.

✅ Reports the Status of All Hints Explored

Keep track of which hints were tested and their impact on performance.

✅ Efficiency Boost

Skip irrelevant hints effortlessly.

Time-out suboptimal plans to avoid resource drain.

🌐 https://learn.microsoft.com/en-us/ssms/query-hint-tool/hint-tool-overview

 

The SharePoint + Copilot Advantage – Metadata Understanding

🔹 Feature: The SharePoint + Copilot Advantage – Metadata Understanding

🔹 What It Does: Metadata is the descriptive information attached to files (think tags, categories, dates, and custom labels). It transforms a basic document library into a dynamic, searchable knowledge base.

What It’s Giving You:

✅ Enhanced Search Precision: By enriching SharePoint content with metadata, organizations create clear boundaries, making information retrieval faster and more accurate.

✅ Reliable AI Outputs: Metadata acts as an anchor, helping Copilot and agents deliver consistent, business-ready answers while maintaining creative flexibility.

✅ Improved Business Decisions: Structured, metadata-rich environments enable AI to provide precise insights, crucial when business accuracy matters.

✅ Optimized Knowledge Management: Metadata transforms scattered information into organized, easily accessible business knowledge.

While Large Language Models (LLMs) remain probabilistic at their core, metadata helps refine AI’s precision, especially in business-critical scenarios.

🌐 Learn more: https://techcommunity.microsoft.com/blog/spblog/sharepoint-showcase-how-metadata-and-the-knowledge-agent-elevate-microsoft-365-c/4464079

 

Create memory dump of Azure App Service – manual way

Open the Kudu console with PowerShell.

List all processes:

Get-Process | Sort-Object {$_.Threads.Count} -Descending | Select-Object ProcessName, Id, @{n='Threads';e={$_.Threads.Count}}, @{n='Memory_MB';e={[math]::Round($_.WorkingSet64/1MB,2)}} | Format-Table -AutoSize

Select the process ID with the most threads (not the SCM one); you can confirm it in the Kudu console.

Go to c:\devtools\sysinternals:

cd c:\devtools\sysinternals

Check the current logs:

dir c:\home\logfiles

Create the memory dump (replace 9888 with the process ID you selected):

./procdump.exe -accepteula -ma 9888 c:\home\logfiles

It should end with output similar to:
[12:16:21] Dump 1 initiated: c:\home\logfiles\w3wp.exe_260115_121621.dmp
[12:16:34] Dump 1 writing: Estimated dump file size is 391 MB.
[12:17:02] Dump 1 complete: 391 MB written in 41.3 seconds
[12:17:03] Dump count reached.

 

Confirm that the file is there:

dir c:\home\logfiles

Copilot Formula in Excel

🔹 Feature: Copilot Formula in Excel

🔹 What It Does: Supercharges your Excel experience with AI-driven capabilities, making data tasks smarter and faster!

✅ Condense Information: Summarize long strings or cell ranges into concise insights.

✅ Generate Sample Data: Quickly create placeholder or demo data for prototypes.

✅ Classify Content: Assign categories or sentiment tags to text entries effortlessly.

✅ Generate Text: Craft simple text content based on your data.

✨ Great for: Enhancing productivity, streamlining data analysis, and boosting content creation without extra hassle.

🌐 Discover more: https://support.microsoft.com/en-us/office/copilot-function-5849821b-755d-4030-a38b-9e20be0cbf62

 

MS SQL Migration Options

🔹 Highlight: SQL Migration Options

✅ SQL Server version upgrades including migrations to Azure → Use SQL Server Management Studio (SSMS)

What it gives you:

• Familiar, built-in tooling for DBAs

• Guided upgrade and compatibility checks

• Minimal disruption for in-place or side-by-side upgrades

✅ Migrations to Azure SQL Database → Use Data Migration Service (DMS)

What it gives you:

• Simplified cloud migration with guided workflows

• Online and offline migration support

• Reduced downtime and built-in validation

✅ Heterogeneous migrations to SQL Server → Use SQL Server Migration Assistant (SSMA)

What it gives you:

• Migration from Oracle, MySQL, PostgreSQL, and more

• Schema conversion and data migration in one tool

• Assessment reports to identify potential issues early

Nice page for browsing type of migration:

🌐 https://learn.microsoft.com/en-us/data-migration/

 

Upload to Azure Devops – This push was rejected because its size is greater than the 5120 MB limit for pushes in this repository.

When uploading a fresh repository to Git, especially to Azure DevOps, you may see:

This push was rejected because its size is greater than the 5120 MB limit for pushes in this repository.

The idea is to split the upload across multiple commits. Here is a script that can do it automatically, based on file size.

Invocation from the directory containing the repo: ..\push-one-dir.ps1 -RepoUrl "https://dev.azure.com/organisation/SVN/_git/gitrepo"

 

# Push repository in batches under 4500 MB
# Handles Polish/Unicode characters in filenames properly
# Run from the repository directory, e.g.: cd C:\svn\MNG then ..\push-one-dir.ps1
# ..\push-one-dir.ps1 -RepoUrl "https://github.com/user/repo.git" -MaxBatchSizeMB 3000
# ..\push-one-dir.ps1 -RepoUrl "https://github.com/user/repo.git" -InitializeRepo
param(
    [Parameter(Mandatory=$true)]
    [string]$RepoUrl,

    [Parameter(Mandatory=$false)]
    [int]$MaxBatchSizeMB = 4500,

    [Parameter(Mandatory=$false)]
    [switch]$InitializeRepo = $false
)

# Ensure UTF-8 encoding for proper handling of Polish characters
[Console]::OutputEncoding = [System.Text.Encoding]::UTF8
$PSDefaultParameterValues['Out-File:Encoding'] = 'utf8'

# Start logging
$timestamp = Get-Date -Format "yyyyMMdd_HHmmss"
$repoName = Split-Path -Leaf (Get-Location)
$logFile = "C:\svn\logs\push_${repoName}_${timestamp}.log"

# Create logs directory if it does not exist
if (-not (Test-Path "C:\svn\logs")) {
    New-Item -ItemType Directory -Path "C:\svn\logs" | Out-Null
}

Start-Transcript -Path $logFile

Write-Host "========================================" -ForegroundColor Cyan
Write-Host "Starting push for: $repoName" -ForegroundColor Cyan
Write-Host "Repository URL: $RepoUrl" -ForegroundColor Cyan
Write-Host "Max batch size: $MaxBatchSizeMB MB" -ForegroundColor Cyan
Write-Host "Log file: $logFile" -ForegroundColor Cyan
Write-Host "Started at: $(Get-Date)" -ForegroundColor Cyan
Write-Host "========================================" -ForegroundColor Cyan

$maxBatchSizeBytes = $MaxBatchSizeMB * 1MB

# Configure Git for Unicode support
Write-Host "Configuring Git for Unicode support..." -ForegroundColor Yellow
git config core.quotePath false
git config i18n.commitEncoding utf-8
git config i18n.logOutputEncoding utf-8

# Initialize repo if requested
if ($InitializeRepo) {
    if (Test-Path .git) {
        Write-Host "Removing existing .git folder..." -ForegroundColor Yellow
        Remove-Item -Recurse -Force .git
    }

    git init
    git remote add origin $RepoUrl
}

# First commit - just .gitignore if it exists
if (Test-Path .gitignore) {
    Write-Host "Adding .gitignore..." -ForegroundColor Yellow
    git add .gitignore
    git commit -m "Initial commit - gitignore" 2>$null
    git push -u origin master --force
    Write-Host ".gitignore pushed successfully" -ForegroundColor Green
} else {
    Write-Host "No .gitignore found, skipping initial commit" -ForegroundColor Yellow
    # Create an empty initial commit to establish the branch
    git commit --allow-empty -m "Initial commit"
    git push -u origin master --force
}

# Get all files using PowerShell's Get-ChildItem (handles Unicode properly)
# This bypasses Git's problematic path escaping
Write-Host "Scanning for files..." -ForegroundColor Yellow

$currentPath = (Get-Location).Path
$allFileObjects = Get-ChildItem -Recurse -File -Force | Where-Object {
    $_.FullName -notmatch '\\\.git\\' -and
    $_.FullName -notmatch '\\\.svn\\' -and
    $_.Name -ne '.gitignore'
}

Write-Host "Found $($allFileObjects.Count) files to process" -ForegroundColor Cyan

# Check which files are already tracked by git
$trackedFiles = @{}
$gitLsFiles = git ls-files -z 2>$null
if ($gitLsFiles) {
    $gitLsFiles -split "`0" | Where-Object { $_ -ne "" } | ForEach-Object {
        $trackedFiles[$_] = $true
    }
}

Write-Host "Already tracked: $($trackedFiles.Count) files" -ForegroundColor Cyan

$currentBatchSize = 0
$currentBatchFiles = @()
$batchNumber = 1
$skippedFiles = @()
$failedFiles = @()
$totalFilesProcessed = 0

foreach ($fileObj in $allFileObjects) {
    # Get relative path
    $relativePath = $fileObj.FullName.Substring($currentPath.Length + 1)
    # Normalize path separators for git
    $relativePath = $relativePath -replace '\\', '/'

    # Skip if already tracked
    if ($trackedFiles.ContainsKey($relativePath)) {
        continue
    }

    $fileSize = $fileObj.Length

    # If a single file exceeds the limit, skip it and warn
    if ($fileSize -gt $maxBatchSizeBytes) {
        Write-Host "WARNING: Skipping '$relativePath' - file too large ($([math]::Round($fileSize/1MB, 2)) MB)" -ForegroundColor Yellow
        $skippedFiles += [PSCustomObject]@{
            File = $relativePath
            SizeMB = [math]::Round($fileSize/1MB, 2)
            Reason = "Exceeds batch size limit"
        }
        continue
    }

    # If adding this file would exceed the limit, push the current batch first
    if (($currentBatchSize + $fileSize) -gt $maxBatchSizeBytes -and $currentBatchFiles.Count -gt 0) {
        Write-Host ""
        Write-Host "========================================" -ForegroundColor Green
        Write-Host "Pushing batch $batchNumber ($([math]::Round($currentBatchSize/1MB, 2)) MB, $($currentBatchFiles.Count) files)..." -ForegroundColor Green
        Write-Host "========================================" -ForegroundColor Green

        $batchFailed = @()
        foreach ($batchFile in $currentBatchFiles) {
            # Add by relative path; "--" keeps paths from being parsed as options
            $result = git add -- "$batchFile" 2>&1
            if ($LASTEXITCODE -ne 0) {
                Write-Host "  WARNING: Failed to add '$batchFile': $result" -ForegroundColor Red
                $batchFailed += $batchFile
            }
        }

        if ($batchFailed.Count -gt 0) {
            $failedFiles += $batchFailed
        }

        # Check if there's anything to commit
        $status = git status --porcelain
        if ($status) {
            git commit -m "Batch $batchNumber - $($currentBatchFiles.Count) files"

            $pushResult = git push origin master 2>&1
            if ($LASTEXITCODE -ne 0) {
                Write-Host "ERROR: Push failed for batch $batchNumber" -ForegroundColor Red
                Write-Host $pushResult -ForegroundColor Red

                # Try to recover - maybe the batch is too large for the server
                Write-Host "Attempting to reset and continue..." -ForegroundColor Yellow
                git reset HEAD~1
            } else {
                Write-Host "Batch $batchNumber completed at $(Get-Date)" -ForegroundColor Cyan
                $totalFilesProcessed += ($currentBatchFiles.Count - $batchFailed.Count)
            }
        } else {
            Write-Host "No changes to commit in batch $batchNumber" -ForegroundColor Yellow
        }

        # Reset for next batch
        $currentBatchSize = 0
        $currentBatchFiles = @()
        $batchNumber++

        # Small delay to prevent overwhelming the server
        Start-Sleep -Milliseconds 500
    }

    # Add file to current batch
    $currentBatchFiles += $relativePath
    $currentBatchSize += $fileSize
}

# Push remaining files
if ($currentBatchFiles.Count -gt 0) {
    Write-Host ""
    Write-Host "========================================" -ForegroundColor Green
    Write-Host "Pushing final batch $batchNumber ($([math]::Round($currentBatchSize/1MB, 2)) MB, $($currentBatchFiles.Count) files)..." -ForegroundColor Green
    Write-Host "========================================" -ForegroundColor Green

    $batchFailed = @()
    foreach ($batchFile in $currentBatchFiles) {
        $result = git add -- "$batchFile" 2>&1
        if ($LASTEXITCODE -ne 0) {
            Write-Host "  WARNING: Failed to add '$batchFile': $result" -ForegroundColor Red
            $batchFailed += $batchFile
        }
    }

    if ($batchFailed.Count -gt 0) {
        $failedFiles += $batchFailed
    }

    $status = git status --porcelain
    if ($status) {
        git commit -m "Batch $batchNumber - $($currentBatchFiles.Count) files (final)"

        $pushResult = git push origin master 2>&1
        if ($LASTEXITCODE -ne 0) {
            Write-Host "ERROR: Push failed for final batch" -ForegroundColor Red
            Write-Host $pushResult -ForegroundColor Red
        } else {
            Write-Host "Batch $batchNumber completed at $(Get-Date)" -ForegroundColor Cyan
            $totalFilesProcessed += ($currentBatchFiles.Count - $batchFailed.Count)
        }
    } else {
        Write-Host "No changes to commit in final batch" -ForegroundColor Yellow
    }
}

# Verify what's still untracked
Write-Host ""
Write-Host "Verifying repository status..." -ForegroundColor Yellow
$remainingUntracked = git status --porcelain | Where-Object { $_ -match '^\?\?' }
$remainingCount = ($remainingUntracked | Measure-Object).Count

# Summary
Write-Host ""
Write-Host "========================================" -ForegroundColor Cyan
Write-Host "SUMMARY" -ForegroundColor Cyan
Write-Host "========================================" -ForegroundColor Cyan
Write-Host "Total batches pushed: $batchNumber" -ForegroundColor Green
Write-Host "Total files processed: $totalFilesProcessed" -ForegroundColor Green
Write-Host "Remaining untracked: $remainingCount" -ForegroundColor $(if ($remainingCount -eq 0) { "Green" } else { "Yellow" })
Write-Host "Completed at: $(Get-Date)" -ForegroundColor Green

if ($skippedFiles.Count -gt 0) {
    Write-Host ""
    Write-Host "Skipped files (too large):" -ForegroundColor Yellow
    $skippedFiles | Format-Table -AutoSize
}

if ($failedFiles.Count -gt 0) {
    Write-Host ""
    Write-Host "Failed to add ($($failedFiles.Count) files):" -ForegroundColor Red
    $failedFiles | ForEach-Object { Write-Host "  $_" -ForegroundColor Red }
}

if ($remainingCount -gt 0) {
    Write-Host ""
    Write-Host "Still untracked files ($remainingCount):" -ForegroundColor Yellow
    $remainingUntracked | Select-Object -First 20 | ForEach-Object {
        Write-Host "  $_" -ForegroundColor Yellow
    }
    if ($remainingCount -gt 20) {
        Write-Host "  ... and $($remainingCount - 20) more" -ForegroundColor Yellow
    }

    # Save remaining files to a separate log
    $remainingLogFile = "C:\svn\logs\remaining_${repoName}_${timestamp}.txt"
    $remainingUntracked | Out-File -FilePath $remainingLogFile -Encoding UTF8
    Write-Host ""
    Write-Host "Full list of remaining files saved to: $remainingLogFile" -ForegroundColor Cyan
}

Write-Host ""
Write-Host "Log saved to: $logFile" -ForegroundColor Cyan

Stop-Transcript

Before running the script, I suggest invoking:

git config --global core.autocrlf false

git config --global core.longpaths true

git config --global http.postBuffer 524288000
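Stripped of the Git plumbing, the core of the script above is a greedy grouping pass: walk the files, accumulate until the next file would cross the size limit, then flush the batch; oversized files are set aside with a warning. A minimal sketch of that logic (the file names and sizes are made up):

```python
def split_into_batches(files, max_batch_bytes):
    """Greedily group (path, size) pairs into batches under max_batch_bytes.

    Files larger than the limit are skipped, mirroring the script's
    "file too large" warning path.
    """
    batches, current, current_size, skipped = [], [], 0, []
    for path, size in files:
        if size > max_batch_bytes:
            skipped.append(path)       # too big for any batch
            continue
        if current and current_size + size > max_batch_bytes:
            batches.append(current)    # flush: next file would overflow
            current, current_size = [], 0
        current.append(path)
        current_size += size
    if current:
        batches.append(current)        # final partial batch
    return batches, skipped

# Hypothetical file list with sizes in (say) MB against a 4500 MB limit
files = [("a.bin", 3000), ("b.bin", 2000), ("c.bin", 2500),
         ("huge.iso", 6000), ("d.txt", 10)]
batches, skipped = split_into_batches(files, 4500)
print(batches, skipped)
```

Each returned batch then corresponds to one add/commit/push cycle, which is exactly what keeps every push under the server's 5120 MB limit.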

Microsoft Fabric

Microsoft Fabric has been around for a few years now and brings data tools together in one SaaS platform — from copying databases to running analytics on the fly. Before, we often used separate services like Azure Data Factory, but now it’s all in one place.  Are you using Microsoft Fabric already, or sticking with Azure services like Azure Data Factory? If not, what’s holding you back? Would love to hear your experience!

🔹 Feature: Microsoft Fabric

🔹 What It Does: A unified data platform designed for seamless data transformation

What It Gives You:

🚀 Now enhanced with capabilities to ingest, process, transform, and route events using Eventstream—making your data flow smarter and faster.

🚀 Now with new database mirroring options from SQL Server 2025, so mirroring is no longer limited to Change Data Capture.

✅ Data Factory: Simplify data integration and orchestration across diverse sources.

✅ Analytics: Gain powerful insights with advanced analytical capabilities.

✅ Databases: Manage structured and unstructured data efficiently.

✅ Real-Time Intelligence: Analyze and respond to data events in real-time.

✅ Power BI: Transform data into rich, interactive visual reports.

More info:

🌐 https://learn.microsoft.com/en-us/fabric/fundamentals/microsoft-fabric-overview#components-of-microsoft-fabric

Teams Meeting Facilitator

🔹 Feature: Teams Meeting Facilitator

🔹 What It Does: Enhances your Teams meetings by streamlining productivity and collaboration.

What is it giving you:

✅ Establish meeting goals & agenda

✅ Track agenda, moderate discussions, and manage time effectively

✅ Take real-time, collaborative notes—editable by all attendees, even those without a Copilot license, during the meeting

✅ Answer questions about the meeting or pull information from the web

✅ Capture and manage tasks

✅ Create documents based on discussions

✅ Enhanced focus: Attendees can concentrate on the discussion while Facilitator handles note-taking

✅ Seamless collaboration: Meeting notes powered by Loop, fully editable and shareable like all Loop pages

✅ Efficiency: Real-time updates ensure everyone stays aligned

Pre-requisites:

⚠️ At least one participant with a Microsoft 365 Copilot License to activate Facilitator

⚠️ Transcription must be enabled

⚠️ “Loop experiences in Teams” functionality enabled

More info:

🌐 https://support.microsoft.com/en-us/office/facilitator-in-microsoft-teams-meetings-37657f91-39b5-40eb-9421-45141e3ce9f6

 

Azure IoT Operations

🔹 Feature: Azure IoT Operations

🔹 What It Does: Azure IoT Operations is a robust set of data services designed to run on Azure Arc-enabled edge Kubernetes clusters. It streamlines IoT deployments, ensuring efficient data flow and asset management.

What is it giving you:

✅ MQTT Broker: Powers event-driven architectures at the edge.

✅ Akri Connectors: Facilitates easy communication, e.g., with OPC UA servers.

✅ Data Flows: Enables data transformation, contextualization, and routing.

✅ Operations Experience: A user-friendly web UI for managing assets and data flows seamlessly.

More info:

🌐 https://learn.microsoft.com/en-us/azure/iot-operations/overview-iot-operations

🌐 https://learn.microsoft.com/en-us/azure/iot-operations/get-started-end-to-end-sample/quickstart-deploy (Great Demo that can be used also to deploy the other Azure Arc Kubernetes Connected Clusters)
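The MQTT broker can be exercised with any standard MQTT client. As a minimal sketch using the Mosquitto tools (the broker hostname, port, and topic below are placeholders; a real Azure IoT Operations deployment typically fronts the broker with TLS and authentication):

```shell
# Publish a sample event to the edge MQTT broker (placeholder values, no TLS/auth shown)
mosquitto_pub -h aio-broker.example.local -p 1883 \
  -t "factory/line1/temperature" -m '{"value": 21.5, "unit": "C"}'
```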

Oracle Database@Azure

🔹 Feature: Oracle Database@Azure

🔹 What It Does: Brings mission-critical Oracle Database workloads directly into Azure with OCI-powered infrastructure, unified operations, and full feature/licensing parity — all managed using Azure-native tools.

What is it giving you?

✅ Unified Management with Azure Arc: Manage Oracle databases and Azure resources seamlessly through a single control plane. Enforce policies and ensure security uniformly across environments.

✅ Flexible Licensing: Leverage the Bring Your Own License (BYOL) model for cost efficiency.

✅ High performance, resiliency & microsecond latency: Powered by OCI’s RDMA-based architecture now running inside Azure datacenters — delivering predictable, enterprise-grade throughput.

✅ Available in 30+ regions by end of 2025

✅ MACC eligible: Spend on Oracle Database@Azure contributes to your Microsoft Azure Consumption Commitment.

✅ Flexible Plans with High Availability (HA) Features:

BRONZE – Foundational reliability for non-critical workloads  

SILVER – Enhanced availability & backup automation  

GOLD – Advanced HA, failover capabilities & rapid recovery  

PLATINUM – Highest resiliency, multi-zone HA & enterprise-grade continuity

More info:

🌐 https://techcommunity.microsoft.com/blog/oracleonazureblog/oracle-databaseazure-at-oracle-ai-world-2025-powering-the-next-wave-of-enterpris/4460749

 

Azure Kubernetes Fleet Manager

🔹 Feature: Azure Kubernetes Fleet Manager

🔹 What It Does: Simplifies multicluster management for AKS with workload propagation, load balancing, and upgrade orchestration.

What It’s Giving You:

✅ Group and Organize All Your AKS Clusters: Gain streamlined visibility and control over all your AKS clusters in one place.

✅ Get a Managed Experience: Azure handles the complexity, offering a fully managed approach to multi-cluster management.

✅ Propagate Kubernetes Configurations Across Clusters: Ensure consistency and compliance by easily distributing Kubernetes configurations to multiple clusters.

✅ Orchestrate Multi-Cluster Networking Scenarios: Simplify complex networking with built-in support for multi-cluster communication and load balancing.

✅ Support for Azure Arc–Enabled Kubernetes: Extend Fleet management to on-premises and multi-cloud Kubernetes clusters through Azure Arc integration.

More info:

🌐 https://azure.microsoft.com/en-us/products/kubernetes-fleet-manager#features
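As a sketch of how this looks from the CLI (the resource names and the member cluster resource ID below are placeholders, and the `fleet` Azure CLI extension is required):

```shell
# Create a Fleet hub and join an existing AKS cluster to it (illustrative values)
az extension add --name fleet

az fleet create \
  --resource-group rg-fleet-demo \
  --name demo-fleet \
  --location westeurope

az fleet member create \
  --resource-group rg-fleet-demo \
  --fleet-name demo-fleet \
  --name member-1 \
  --member-cluster-id "/subscriptions/<sub-id>/resourceGroups/rg-aks/providers/Microsoft.ContainerService/managedClusters/aks-1"
```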

 

Azure Service Groups – the next evolution beyond Management Groups when they aren’t enough

🔹 Feature: Azure Service Groups – the next evolution beyond Management Groups when they aren’t enough.

🔹 What It Does: Azure Service Groups allow you to group resources across multiple subscriptions. They enable streamlined resource management with minimal permissions, ensuring security without over-privileging access.

What is it giving you:

✅ Cross-Subscription Grouping: Seamlessly organize resources across different subscriptions for better governance.

✅ Minimal Permissions Required: Manage resource groups effectively without needing excessive access rights.

✅ Enhanced Resource Management: Simplify administrative tasks and improve efficiency by grouping related resources logically.

✅ Secure and Scalable: Designed to support complex enterprise environments while maintaining robust security.

More info:

🌐 https://learn.microsoft.com/en-us/azure/governance/service-groups/overview

 

Pi Zero as RDP Jump Host with xrdp, NeutrinoRDP Proxy and Azure Arc

Project goal and assumptions

The goal of this project is to build a small, energy‑efficient jump host that enables remote RDP access to Windows machines inside a corporate network without exposing that network directly to the Internet. Instead of forwarding port 3389 on the router (which usually results in scans and brute‑force attempts), the Raspberry Pi operates on the local network and acts as a secure intermediary: it accepts the administrator’s connection and then establishes an RDP session to the selected Windows host.
The biggest advantage is that the Raspberry Pi does not need a public IP address or any open inbound ports. Administrative access is provided through Azure Arc (SSH over Arc), which makes it possible to connect via SSH to Arc‑enabled servers without a public IP and without opening additional ports.

Why Ethernet and not Wi‑Fi?

In a jump host scenario, reliability matters more than convenience. This type of device is often used to help other users or to fix outages – exactly when “unstable Wi‑Fi” can be most disruptive. For this reason, it is best to connect the Raspberry Pi to the internal network using a wired Ethernet connection.
At the same time, Wi‑Fi plays an interesting backup/architectural role here: if the LAN (with the Windows hosts – and not only them, since SSH is also in play) has no Internet access for security reasons, the Raspberry Pi can still reach the “outside world” (Azure) over Wi‑Fi. In practice, this provides an additional benefit: you can keep the server network “isolated” while still retaining the ability to perform remote administration via Arc.

How the RDP proxy works (xrdp + NeutrinoRDP)

The RDP “jump” layer is implemented using xrdp running on the Raspberry Pi in proxy mode with NeutrinoRDP. NeutrinoRDP is part of the neutrinolabs ecosystem and is a fork of FreeRDP 1.0.1, providing the RDP client implementation used in the proxy scenario.
One important practical detail: the NeutrinoRDP proxy does not always work “out of the box” after installing xrdp from distribution packages. In many distributions, the packages do not include ready‑to‑use proxy libraries/modules, so to obtain a working module (for example libxrdpneutrinordp.so), you must compile NeutrinoRDP from source and then rebuild xrdp with NeutrinoRDP support enabled (for example ./configure --enable-neutrinordp).
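With the module built, the jump target is defined as a session entry in /etc/xrdp/xrdp.ini. A minimal sketch of such an entry (the section name, target IP, and credentials policy are illustrative placeholders, not values from this setup):

```ini
[NeutrinoRDP-Jump]
name=Windows Jump Target
lib=libxrdpneutrinordp.so
username=ask
password=ask
ip=192.168.10.20
port=3389
```

With `username=ask` and `password=ask`, xrdp prompts the connecting administrator for the Windows credentials instead of storing them on the Pi.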

Azure Arc – access without a public IP

The Raspberry Pi acts as an edge server and is onboarded to Azure Arc‑enabled servers. Thanks to this, it is possible to connect to the Pi over SSH “through Azure” (SSH over Arc), even if the Raspberry Pi is behind NAT and has no port forwarding configured. Microsoft describes this mode as SSH access to Arc‑enabled servers without exposing them directly to the Internet.
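Once the Pi is onboarded (via `azcmagent connect`) and SSH access is enabled on the Arc resource, the connection from an admin workstation can look like this (the resource group, machine name, and local user below are placeholders):

```shell
# Requires the Azure CLI "ssh" extension
az extension add --name ssh

# Connect over Arc: traffic is relayed through Azure, no inbound port on the Pi
az ssh arc --resource-group rg-edge --name pi-jumphost --local-user pi
```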

Step by Step:

sudo apt-get update
sudo apt-get -y install mc build-essential git cmake libssl-dev libx11-dev libxext-dev libxinerama-dev libxcursor-dev libxdamage-dev libxv-dev libxkbfile-dev libasound2-dev libcups2-dev libxml2 libxml2-dev libxrandr-dev libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev libavutil-dev libavcodec-dev libavformat-dev libswscale-dev
git clone https://github.com/neutrinolabs/NeutrinoRDP.git
cd NeutrinoRDP
cmake -DCMAKE_BUILD_TYPE=Release -DWITH_SSE2=OFF .
make

If the build fails, add the missing <stdlib.h> include and rebuild:

sed -i '/#include <freerdp\/rail.h>/a #include <stdlib.h>' /home/mf/NeutrinoRDP/libfreerdp-utils/rail.c

make clean
make

sudo make install

sudo apt-get install -y gcc make libssl-dev libpam0g-dev libx11-dev libxfixes-dev libxrandr-dev mc autoconf automake libtool libxkbfile-dev
cd ~
git clone https://github.com/neutrinolabs/xrdp.git
cd xrdp

./bootstrap
./configure --enable-neutrinordp
make

If the build fails in libxrdp/xrdp_caps.c, apply this workaround (removes the offending jpeg_prop comparison) and rebuild:

sed -i 's/self->client_info.jpeg_prop\[0\] < 0 ||//' libxrdp/xrdp_caps.c
make clean
make

sudo make install
sudo systemctl enable xrdp
sudo systemctl restart xrdp.service
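After restarting the service, it is worth verifying that xrdp is actually running and listening before pointing an RDP client at the Pi (a quick sanity check; 3389 is xrdp’s default listening port):

```shell
# Confirm the service is active and the RDP listener is up
systemctl is-active xrdp
ss -tln | grep ':3389'
```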

Azure SQL Managed Instance Next-gen / SQL 2025 / SQL in Fabric

🔹 Feature: Azure SQL Managed Instance Next-gen / SQL 2025 / SQL in Fabric

🔹 What It Does: Enables next-level performance, scalability, and integration for your data solutions across Fabric, on-premises, and Azure

What is it giving you:

✅ Scale up to 128 Cores for enhanced processing power (Managed Instance)

✅ Up to 870 GB of memory to support demanding workloads (Managed Instance)

✅ Up to 32 TB of storage for massive data capacity (Managed Instance)

✅ Host up to 500 DBs per instance, ideal for large-scale deployments (Managed Instance)

✅ DiskANN Vector Indexing for faster search performance (also in SQL 2025)

✅ Real-time data streaming from SQL to Event Hubs (also in SQL 2025)

✅ Seamless mirroring of Azure databases to Fabric for continuity

✅ Simplified migrations with SQL Server Arc-enabled Migration

✅ Oracle Code Conversion Copilot to ease transition from Oracle systems

✅ Enhanced security with Customer Managed Keys (CMK) in SQL Fabric

Stay tuned for a forthcoming post that will showcase enhanced performance for Azure SQL Managed Instance Next-gen.

 

Azure AI Foundry – Advanced Multi-Agent Platform, MCP Integration & Enterprise Observability

🔹 Feature: Azure AI Foundry – Advanced Multi-Agent Platform, MCP Integration & Enterprise Observability

🔹 What It Does: Azure AI Foundry introduces a cutting-edge, production-grade architecture for building AI agent systems, featuring native orchestration, unified data layers, and secure identity/control across the entire agent lifecycle.

What is it giving you:

🔧 Foundry Agent Service

Microsoft launches a powerful runtime and hosting layer for AI agents:

✅ Supports Microsoft Agent Framework, LangGraph, and third-party orchestrators.

✅ Native multi-agent orchestration with seamless message passing, routing, delegation, and hierarchical agent patterns.

✅ Automatic MCP (Model Context Protocol) tool discovery:

Converts custom tools & APIs into MCP servers.

Enables dynamic tool discovery with full schema + policy visibility.

✅ Serverless execution model — effortlessly scales agents up/down without infrastructure management.

In practice: Azure AI Foundry transforms into a cloud-native agent runtime, far beyond just a model hosting platform.

📚 Foundry Knowledge

A unified knowledge layer designed to tackle data fragmentation complexities in RAG pipelines:

✅ Connects seamlessly with 1,400+ enterprise systems (Dynamics, SAP, SQL, Salesforce, ServiceNow, etc.).

✅ Automates data ingestion, chunking, metadata extraction, and vectorization.

✅ Offers a normalized semantic layer with unified permissions (inherits M365 + Entra RBAC).

✅ No need for separate data pipelines — agents query a single Knowledge API.

Architecturally: It mirrors an enterprise-wide semantic data fabric:

Centralized governance → Distributed access → Consistent authorization → Unified embeddings.

📈 Foundry Observability

Comprehensive observability for agent-based systems:

✅ Request/response tracing across complex multi-agent chains.

✅ Evaluation pipelines for quality, grounding, and hallucination detection.

✅ Vector drift detection and RAG quality analytics.

✅ Guardrail telemetry tracking blocked calls, policy violations, and safety triggers.

✅ Cost analysis per agent, tool, workflow, and execution step.

Agent Control Plane (Preview) adds:

✅ Fleet-wide metrics across subscriptions and workspaces.

✅ Visual dependency maps (agents ↔ models ↔ tools ↔ data sources).

✅ Health scoring and proactive anomaly alerts.

Effectively: This brings APM-style observability (like Application Insights) to AI agents.

🔐 Security: Agent Identity, Purview Integration & Zero-Trust Enforcement

✅ Agent ID (via Microsoft Entra): Grants agents first-class identity to authenticate, request tokens, and operate autonomously or on behalf of users.

✅ Purview Integration:

Automates data classification, sensitivity labeling, and lineage tracking.

Enforces policies at every RAG/agent retrieval and tool interaction.

✅ Defender Integration:

Detects threats in agent messaging, tool calls, and model outputs.

Protects against prompt injection, data exfiltration attempts, and anomalous behaviors.

This sets the benchmark for Zero-Trust architecture in multi-agent systems.

More info:

🌐 https://aka.ms/azureaifoundry

🌐 https://learn.microsoft.com/azure/ai-studio/

 

⚠️ Please be aware of the rebranding to Microsoft Foundry – https://learn.microsoft.com/en-us/azure/ai-foundry/what-is-azure-ai-foundry?view=foundry#microsoft-foundry-portals
