
This device is joined to Azure AD. To join an Active Directory domain, you must first go to settings and choose to disconnect your device from your work or school

Sometimes you can see this message when you try to join a computer – even a server – to the domain.

 

Computer Name/Domain Changes The following error occurred attempting to join the domain This device is joined to Azure AD. To join an Active Directory domain, you must first go to settings and choose to disconnect your device from your work or school

The message is clear – so the solution is to execute:

 

dsregcmd /status
dsregcmd /leave
dsregcmd /status

 

 

 

virtual-environments GithubActions, Azure Devops – Cannot bind argument to parameter ‘ApplicationId’ because it is an empty string.

During image building with Azure virtual-environments (https://github.com/actions/virtual-environments) you can see:

New-AzADAppCredential : Cannot bind argument to parameter ‘ApplicationId’ because it is an empty string.
At C:\virtual-environments\helpers\GenerateResourcesAndImage.ps1:224 char:13
+ $appCred = New-AzADAppCredential -ApplicationId $sp.AppId …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidData: (:) [New-AzADAppCredential], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationErrorEmptyStringNotAllowed,New-AzADAppCredential

In that case you must add the following parameters:

  • AzureClientId
  • AzureClientSecret
  • AzureTenantId

So the whole command:

GenerateResourcesAndImage -SubscriptionId xxxx -ResourceGroupName "DevOpsPackerAgent" -ImageGenerationRepositoryRoot "$pwd" -ImageType Windows2019 -AzureLocation "uksouth" -AzureClientId "yyy" -AzureClientSecret "zzz" -AzureTenantId "qqq"

Teams – share files between organizations, tenants.

Sometimes I need to explain, several times, the differences between the Guest and "non-guest" experience in Microsoft Teams.

https://docs.microsoft.com/en-us/microsoftteams/communicate-with-users-from-other-organizations

Azure Cost Savings – Change all disk to standard

This script is just a reminder that using PowerShell you can change the disk SKU for your VMs – and in that way save cost. You can also build a solution that changes the SKU when a VM is switched off and changes it back when it is powered on.

#$SubscrybtionDev="d2e666be-5fde-451b-84c9-9c545b3e435c"
#Connect-AzAccount -UseDeviceAuthentication
#Set-AzContext -Subscription $SubscrybtionDev

# Change every managed disk in every resource group to Standard HDD (Standard_LRS)
$RGs = Get-AzResourceGroup
$storageType = 'Standard_LRS'
foreach ($RG in $RGs)
{
    $vmDisks = Get-AzDisk -ResourceGroupName $RG.ResourceGroupName
    foreach ($disk in $vmDisks)
    {
        $disk.Sku = [Microsoft.Azure.Management.Compute.Models.DiskSku]::new($storageType)
        $disk | Update-AzDisk
        Write-Host $disk.Id
    }
}
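As mentioned above, you can limit this to machines that are switched off. Below is a minimal sketch of that idea, downgrading only disks attached to deallocated VMs; the SKU value is the same as above and everything else is an assumption, not a tested production script.

# Sketch: change the SKU only for disks attached to deallocated VMs
$storageType = 'Standard_LRS'

# Get-AzVM -Status returns a PowerState string such as "VM deallocated"
$deallocatedVms = Get-AzVM -Status | Where-Object { $_.PowerState -eq 'VM deallocated' }

foreach ($vm in $deallocatedVms)
{
    # Find the managed disks owned by this VM and downgrade them
    Get-AzDisk | Where-Object { $_.ManagedBy -eq $vm.Id } | ForEach-Object {
        $_.Sku = [Microsoft.Azure.Management.Compute.Models.DiskSku]::new($storageType)
        $_ | Update-AzDisk
        Write-Host "Changed $($_.Name) to $storageType"
    }
}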

Connect to Azure Rest API via CURL

As a user:

curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
apt install -y jq
az login
az account get-access-token
declare subid="xxx"
declare response=$(az account get-access-token)
declare token=$(echo $response | jq ".accessToken" -r)

curl -i -X GET -H "x-ms-version: 2018-11-09" -H "content-length: 0" -H "Authorization: Bearer $token" "https://management.azure.com/subscriptions/xxx/resourceGroups/nvx-demo-sec/providers/Microsoft.Compute/virtualMachines?api-version=2021-07-01"

As an application:

declare TENANT_NAME="xxx"
# Values for the first app registration
declare CLIENT_ID1="xxx"
declare CLIENT_SECRET1="xxx"

# The scope must match the resource you call; here we call ARM (management.azure.com)
ACCESS_TOKEN=$(curl -X POST -H "Content-Type: application/x-www-form-urlencoded" --data-urlencode "client_id=$CLIENT_ID1" --data-urlencode "client_secret=$CLIENT_SECRET1" --data-urlencode "scope=https://management.azure.com/.default" --data-urlencode "grant_type=client_credentials" "https://login.microsoftonline.com/$TENANT_NAME/oauth2/v2.0/token" | jq -r '.access_token')

curl -i -X GET -H "x-ms-version: 2018-11-09" -H "content-length: 0" -H "Authorization: Bearer $ACCESS_TOKEN" "https://management.azure.com/subscriptions/xxx/resourceGroups/nvx-demo-sec/providers/Microsoft.Compute/virtualMachines?api-version=2021-07-01"
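If you prefer PowerShell to curl, here is a minimal sketch of the same ARM call using the Az module; the subscription ID and resource group are the same placeholders as above.

# Sketch: the same ARM call from PowerShell (requires the Az.Accounts module)
Connect-AzAccount
# In newer Az.Accounts versions the Token property may be returned as a SecureString
$token = (Get-AzAccessToken -ResourceUrl "https://management.azure.com/").Token

$uri = "https://management.azure.com/subscriptions/xxx/resourceGroups/nvx-demo-sec/providers/Microsoft.Compute/virtualMachines?api-version=2021-07-01"
Invoke-RestMethod -Method Get -Uri $uri -Headers @{ Authorization = "Bearer $token" }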

Azure Virtual WAN – Force Tunneling

* Force tunnel to NVA: You can specify a 0.0.0.0/0 route in the defaultRouteTable with next hop Virtual Network Connection, then specify the specific IP of the NVA. This will force all internet-bound traffic to be sent to a Network Virtual Appliance deployed in a spoke Virtual Network. For more detailed instructions, please consider the alternate workflow described here: Route traffic through NVAs by using custom settings – Azure Virtual WAN | Microsoft Docs.

* Force tunnel to Azure Firewall in the Hub: You can use Firewall Manager to configure Virtual WAN to send all internet-bound traffic via Azure Firewall deployed in the Virtual WAN hub. For configuration steps and a tutorial, please reference the following documents (Install Azure Firewall in a Virtual WAN hub – Azure Virtual WAN | Microsoft Docs and, to configure routing, Tutorial: Secure your virtual hub using Azure Firewall Manager | Microsoft Docs). Alternatively, this can also be configured via Routing Policies and Routing Intent. For more information on Routing Policies please read the following document: How to configure Virtual WAN Hub routing policies – Azure Virtual WAN | Microsoft Docs.

* Force tunnel to a third-party provider: You can use Firewall Manager to send internet traffic via a third-party security provider. For more information on this capability, please read the following: Deploy an Azure Firewall Manager security partner provider | Microsoft Docs.

* Force tunnel to a branch: You can configure one of your branches (Site-to-site VPN, ExpressRoute circuit, or Network Virtual Appliance in the Virtual WAN Hub) to advertise the 0.0.0.0/0 route to Virtual WAN. Your on-premises device will have to be configured to do that.

 

This info is from Microsoft and can be very helpful.

Back to young… Small Basic from Microsoft

More than 25 years ago I started programming using BASIC 2.0 (Commodore 64) – the Internet says it was connected with Microsoft as well…

 

But nowadays you can use Small Basic – also from Microsoft:

You can download it from: https://smallbasic-publicwebsite.azurewebsites.net/

You can also export your programs to Visual Basic using this manual:

https://social.technet.microsoft.com/wiki/contents/articles/38265.small-basic-instructions-to-graduate-and-debug-with-visual-studio-2017.aspx

Simply three steps after exporting:

  1. The target application is .NET 4.5.
  2. Add a reference to SmallBasicLibrary.dll.
  3. In case of errors, replace "0.5" with "CType(0.5, Primitive)" to cast. To cast means to convert a variable's type. This sample changes the real literal (0.5) to Primitive (the type used for all Small Basic variables).

 

This one is a good exercise to start learning with: https://github.com/olik-son/instalogik—Zadanie/blob/main/Rozwiazanie.sb

 

Another 3 mistakes that can lead to the compromise of your cloud resources

I invite you to The Hack Summit 2021, where I will have the pleasure of presenting the following session:

Another 3 mistakes that can lead to the compromise of your cloud resources ("Kolejne 3 błędy, które mogą doprowadzić do kompromitacji Twoich zasobów w chmurze").

By the way, if you would like to receive – free of charge, of course – the documentation of the Microsoft Azure inventory and configuration copy (ARM) mechanism that I discuss during the event, as well as a document on how to securely configure the WebApp / App Service in Microsoft Azure, please fill in the form below.

AWS CLI delete multiple snapshots using Name Tag

First of all, display the commands to verify that it is correct to delete the snapshots, using:

aws ec2 describe-snapshots --output text --filters Name=tag:Name,Values=tag-value|grep SNAPSHOTS|awk '{print "Deleting-> " $4,$6,$8,$9,$10; system("echo aws ec2 delete-snapshot --snapshot-id " $10)}'

and then you can delete the snapshots:

aws ec2 describe-snapshots --output text --filters Name=tag:Name,Values=tag-value|grep SNAPSHOTS|awk '{print "Deleting-> " $4,$6,$8,$9,$10; system("aws ec2 delete-snapshot --snapshot-id " $10)}'
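If you work from PowerShell instead, a rough equivalent using the AWS Tools for PowerShell module could look like the sketch below; the tag value is a placeholder and this is an illustration, not a tested replacement.

# Sketch: delete snapshots matching a Name tag using AWS Tools for PowerShell
Import-Module AWS.Tools.EC2

# -OwnerId self limits the result to snapshots owned by your account
$snapshots = Get-EC2Snapshot -OwnerId self -Filter @{ Name = "tag:Name"; Values = "tag-value" }

foreach ($snap in $snapshots)
{
    Write-Host "Deleting -> $($snap.SnapshotId) ($($snap.StartTime), $($snap.VolumeSize) GiB)"
    Remove-EC2Snapshot -SnapshotId $snap.SnapshotId -Force
}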

Zero Trust Model – Audit For Free

You can visit:

https://aka.ms/ZTtool

and get free access to an online security audit of your company.

After answering some questions, you will receive guidance on how to fix potential issues.

Here are all the topics covered – everything that the web-based questionnaire tool can generate.

 

Identities:

 

Implement multifactor authentication.

  1. Multifactor authentication helps protect your applications by requiring users to confirm their identity using a second source of validation, such as a phone or token, before access is granted.
  2. Azure Active Directory (Azure AD) can help you enable multifactor authentication for free.
  3. Already have Azure AD? Start deploying today.

 
 

Enable passwordless authentication.

  1. Passwordless authentication methods such as Windows Hello and Microsoft Authenticator provide a simpler and more secure authentication experience across the web and mobile devices. Based on the recently developed FIDO2 standard, these methods allow users to authenticate easily and securely without requiring a password.
  2. Microsoft can help you adopt passwordless authentication today. Download the passwordless authentication datasheet to learn more.
  3. If you already have Azure Active Directory (Azure AD), see how you can enable passwordless authentication today.

 
 

Implement single sign-on (SSO).

  1. SSO not only strengthens security by removing the need to manage multiple credentials for the same person but also delivers a better user experience with fewer sign-in prompts.
  2. Microsoft Azure Active Directory (Azure AD) provides an SSO experience to popular software as a service (SaaS) apps, on-premises apps, and custom-built apps that reside on any cloud for any user type and any identity.
  3. Plan your SSO deployment.

 
 

Enforce access controls with adaptive, risk-based policies.

  1. Move beyond simple access/block decisions and tailor decisions based on risk appetite—such as allowing access, blocking, limiting access, or requiring additional proofs like multifactor authentication.
  2. Use conditional access in Azure AD to enforce fine-tuned adaptive access controls, such as requiring multifactor authentication, based upon user context, device, location, and session risk information.
  3. Plan your conditional access deployment.

 
 

Block legacy authentication.

  1. One of the most common attack vectors for malicious actors is to use stolen or replayed credentials against legacy protocols, such as SMTP, that can’t use modern security challenges.
  2. Conditional access in Azure AD can help you block legacy authentication. See more information about Block Legacy Authentication.

 
 

Protect identities against compromise.

  1. Real-time risk assessments can help protect against identity compromise at the time of login and during sessions.
  2. Azure Identity Protection delivers real-time continuous detection, automated remediation, and connected intelligence to investigate risky users and sign-ins to address potential vulnerabilities.
  3. Enable Identity Protection to get started. Bring in user session data from Microsoft Cloud App Security to enrich Azure AD with possible risky user behavior after they were authenticated.

 
 

Enrich your Identity and Access Management (IAM) solution with more data.

  1. The more data you feed your IAM solution, the more you can improve your security posture with granular access decisions and better visibility into users accessing corporate resources, and the more you can tailor the end-user experience.
  2. Azure Active Directory (Azure AD), Microsoft Cloud App Security, and Microsoft Defender for Endpoint all work together to provide enriched signal processing for better decision making.
  3. Configure Conditional Access in Microsoft Defender for Endpoint, Microsoft Defender for Identity, and Microsoft Cloud App Security.

 
 

Fine-tune your access policies.

  1. Enforce granular access control with risk-based adaptive access policies that integrate across endpoints, apps, and networks to better protect your data.
  2. Conditional Access in Azure AD enables you to enforce fine-tuned adaptive access controls, such as requiring multi-factor authentication, based upon user context, device, location, and session risk information.
  3. Fine-tune your Conditional Access policies.

 
 

Improve your identity security posture.

  1. The identity secure score in Azure AD helps you assess your identity security posture by analyzing how well your environment aligns with Microsoft best-practice recommendations for security.
  2. Get your identity secure score

 

Endpoints:

 

Register your devices with your identity provider.

  1. In order to monitor security and risk across multiple endpoints used by any one person, you need visibility in all devices and access points that may be accessing your resources.
  2. Devices can be registered with Microsoft Azure AD, providing you with visibility into the devices accessing your network and the ability to utilize device health and status information in access decisions.
  3. Configure and manage device identities in Azure AD

 
 

Enroll devices in Mobile Device Management for internal users.

  1. Once data access is granted, having the ability to control what the user does with your corporate data is critical to mitigating risk.
  2. Microsoft Endpoint Manager enables endpoint provisioning, configuration, automatic updates, device wipe, and other remote actions.
  3. Set up Mobile Device Management for internal users.

 
 

Ensure compliance before granting access.

  1. Once you have identities for all the endpoints accessing corporate resources and before access is granted, you want to ensure that they meet the minimum security requirements set by your organization.
  2. Microsoft Endpoint Manager can help you set compliance rules to ensure that devices meet minimum-security requirements before access is granted. Also, set remediation rules for noncompliant devices so that people know how to resolve the issue.
  3. Set rules on devices to allow access to resources in your organization using Intune

 
 

Enable access for unmanaged devices as needed.

  1. Enabling your employees to access appropriate resources from unmanaged devices can be critical to maintaining productivity. However, it’s imperative that your data is still protected.
  2. Microsoft Intune Mobile Application Management lets you publish, push, configure, secure, monitor, and update mobile apps for your users, ensuring they have access to the apps they need to do their work.
  3. Configure access for unmanaged devices.

 
 

Enroll devices in Mobile Device Management for external users.

  1. Enrolling devices from external users (such as contractors, vendors, partners, etc.) into your MDM solution is a great way to ensure your data is protected and that they have the access they need to do their work.
  2. Microsoft Endpoint Manager provides endpoint provisioning, configuration, automatic updates, device wipe, and other remote actions.
  3. Set up Mobile Device Management for external users.

 
 

Enforce data loss prevention policies on your devices.

  1. Once data access is granted, controlling what the user can do with your data is critical. For example, if a user accesses a document with a corporate identity, you want to prevent that document from being saved in an unprotected consumer storage location, or from being shared with a consumer communication or chat app.
  2. Intune app protection policies will help protect data with or without enrolling devices in a device management solution by restricting access to company resources and keeping data within the purview of your IT department.
  3. Get started with Intune App policies.

 
 

Enable real-time device risk evaluation.

  1. Ensuring only healthy and trusted devices are allowed access to your corporate resources is a critical step in a Zero Trust journey. Once your devices are enrolled with your identity provider, you can bring that signal into your access decisions to only allow safe and compliant devices access.
  2. Through integration with Azure AD, Microsoft Endpoint Manager enables you to enforce more granular access decisions and fine-tune the Conditional Access policies based on your organization’s risk appetite. For example, excluding certain device platforms from accessing specific apps.
  3. Configure Conditional Access in Microsoft Defender for Endpoint

 

Apps

 

Enforce policy-based access control for your apps.

  1. Move beyond simple access/block decisions and tailor decisions based on risk appetite—such as allowing access, blocking, limiting access, or requiring additional proofs like multi-factor authentication.
  2. Conditional Access in Azure AD enables you to enforce fine-tuned adaptive access controls, such as requiring multi-factor authentication, based upon user context, device, location, and session risk information.
  3. Configure Conditional Access for your app access

 
 

Enforce policy-based session controls.

  1. Stopping breaches and leaks in real time before employees intentionally or inadvertently put data and organizations at risk is key to mitigating risk after access is granted. Simultaneously, it’s critical for businesses to enable employees to securely use their own devices.
  2. Microsoft Cloud App Security (MCAS) integrates with Azure Active Directory (Azure AD) conditional access so you can configure apps to work with Conditional Access App Control. Easily and selectively enforce access and session controls on your organization’s apps based on any condition in conditional access (such as preventing data exfiltration, protecting on download, preventing uploads, blocking malware, and more).
  3. Create a Microsoft Cloud App Security session policy to get started.

 
 

Connect your business apps to your cloud application security broker (CASB).

  1. Visibility across apps and platforms is critical for performing governance actions, such as quarantining files or suspending users, as well as mitigating any flagged risk.
  2. Apps connected to Microsoft Cloud App Security (MCAS) get instant, out-of-the-box protection with built-in anomaly detection. MCAS uses entity and user behavioral analytics (UEBA) and machine learning to detect unusual behavior across cloud apps, helping identify threats, such as ransomware, compromised users, or rogue apps.
  3. Connect your business-critical cloud apps to Microsoft Cloud App Security.

 
 

Provide remote access to on-premises apps through an app proxy.

  1. Providing users with secure remote access to internal apps running on an on-premises server is critical to maintaining productivity today.
  2. Azure AD Application Proxy provides secure remote access to on-premises web apps without a VPN or dual-homed servers and firewall rules. Integrated with Azure AD and Conditional Access, it enables users to access web apps through single sign-on while enabling IT to configure Conditional Access policies for fine-tuned access control.
  3. Get started today.

 
 

Discover and manage shadow IT in your network.

  1. The total number of apps accessed by employees in the average enterprise exceeds 1,500. That equates to more than 80 GB of data uploaded monthly to various apps, fewer than 15 percent of which are managed by their IT department. As remote work becomes a reality for most, it’s no longer enough to apply access policies to only your network appliance.
  2. Microsoft Cloud App Security can help you discover which apps are being used, explore the risk of these apps, configure policies to identify new risky apps being used, and unsanction these apps to block them natively using your proxy or firewall appliance. See the e-book, to learn more.
  3. To get started discovering and assessing cloud apps, set up Cloud Discovery in Microsoft Cloud App Security.

 
 

Manage virtual machine access using Just-in-Time.

  1. Limit user access with Just-In-Time and Just-Enough-Access (JIT/JEA), risk-based adaptive polices and data protection to protect both data and productivity.
  2. Lock down inbound traffic to your Azure Virtual Machines with Azure Security Center’s just-in-time (JIT) virtual machine (VM) access feature to reduce your exposure to attacks while providing easy access when you need to connect to a VM.
  3. Enable JIT virtual machine access.

 

Infrastructure

 

Use a cloud workload protection solution.

  1. Having a comprehensive view across all of your cloud workloads is critical to keeping your resources safe in a highly distributed environment.
  2. Azure Security Center is a unified infrastructure security management system that strengthens the security posture of your data centers and provides advanced threat protection across your hybrid workloads in the cloud – whether they’re in Azure or not – as well as on premises.
  3. Configure Azure Security Center

 
 

Assign app identities.

  1. Assigning an app identity is critical to securing communication between different services.
  2. Azure supports managed identities from Azure Active Directory, making it easy to access other Azure AD-protected resources such as Azure Key Vault, where secrets and credentials are securely stored.
  3. Assign an app identity in the Azure Portal

 
 

Segment user and resource access.

  1. Segmenting access for each workload is a key step in your Zero Trust journey.
  2. Microsoft Azure offers many ways to segment workloads to manage user and resource access. Network segmentation is the overall approach, and, within Azure, resources can be isolated at the subscription level with Virtual networks (VNets), VNet peering rules, Network Security Groups (NSGs), Application Security Groups (ASGs), and Azure Firewalls.
  3. Create an Azure Virtual Network to enable your Azure resources to securely communicate with each other.

 
 

Implement threat detection tools.

  1. Preventing, detecting, investigating, and responding to advanced threats across your hybrid infrastructure will help improve your security posture.
  2. Microsoft Defender for Endpoint Advanced Threat Protection is an enterprise endpoint security platform designed to help enterprise networks prevent, detect, investigate, and respond to advanced threats.
  3. Plan your Microsoft Defender for Endpoint Advanced Threat Protection deployment

 
 

Deploy a Security Information and Event Management (SIEM) solution.

  1. As the value of digital information continues to increase, so do the number and sophistication of attacks. SIEMs provide a central way to mitigate threats across the entire estate.
  2. Azure Sentinel is a cloud-native security information event management (SIEM) and security orchestration automated response (SOAR) solution that will allow your Security Operations Center (SOC) to work from a single pane of glass to monitor security events across your enterprise. It helps to protect all of your assets by collecting signals from your entire hybrid organization and then applying intelligent analytics to identify threats quickly.
  3. Deploy Sentinel to get started.

 
 

Implement behavioral analytics.

  1. When you create new infrastructure, you need to ensure that you also establish rules for monitoring and raising alerts. This is key for identifying when a resource is displaying unexpected behavior.
  2. Microsoft Defender for Identity enables signal collection to identify, detect, and investigate advanced threats, compromised identities, and malicious insider actions directed at your organization.
  3. Learn more about Microsoft Defender for Identity

 
 

Set up automated investigations.

  1. Security operations teams face challenges in addressing the multitude of alerts that arise from the seemingly never-ending flow of threats. Implementing a solution with automated investigation and remediation (AIR) capabilities can help your security operations team address threats more efficiently and effectively.
  2. Microsoft Defender for Endpoint Advanced Threat Protection includes automated investigation and remediation capabilities to help examine alerts and take immediate action to resolve breaches. These capabilities can significantly reduce alert volume, allowing security operations to focus on more sophisticated threats and other high-value initiatives.
  3. Learn more about automated investigations.

 
 

Govern access to privileged resources.

  1. Personnel should use administrative access sparingly. When administrative functions are required, users should receive temporary administrative access.
  2. Privileged Identity Management (PIM) in Azure AD enables you to discover, restrict, and monitor access rights for privileged identities. PIM can help ensure your admin accounts stay secure by limiting access to critical operations using just-in-time, time-bound, and role-based access control.
  3. Deploy Privileged Identity Management to get started

 
 

Provide just-in-time access for privileged accounts.

  1. Personnel should use administrative access sparingly. When administrative functions are required, users should receive temporary administrative access.
  2. Privileged Identity Management (PIM) in Azure AD enables you to discover, restrict, and monitor access rights for privileged identities. PIM can help ensure your admin accounts stay secure by limiting access to critical operations using just-in-time, time-bound, and role-based access control.
  3. Deploy Privileged Identity Management to get started.

 

Data

 

Define a classification taxonomy.

  1. Defining the right label taxonomy and protection policies is the most critical step in a data protection strategy, so start with creating a labeling strategy that reflects your organization’s sensitivity requirements for information.
  2. Learn about data classification.
  3. When you’re ready, get started with sensitivity labels.

 
 

Govern access decisions based on sensitivity.

  1. The more sensitive the data, the greater the protection control and enforcement needed. Similarly, the controls should also be commensurate with the nature of the risks associated with how and from where the data is accessed (for example, if a request originates from unmanaged devices or from external users). Microsoft Information Protection offers a flexible set of protection controls based on data sensitivity and risk.
  2. Some sensitive data needs protection by policies that enforce encryption to ensure only authorized users can access the data.
  3. Set up sensitivity labels to govern access decisions. The new Azure Purview provides a unified data governance service that builds on Microsoft Information Protection. Read the announcement blog to learn more.

 
 

Implement a robust data classification and labeling strategy.

  1. Enterprises have vast amounts of data that can be challenging to adequately label and classify. Using machine learning for smarter, automated classification can help reduce the burden on end users and lead to a more consistent labeling experience.
  2. Microsoft 365 provides three ways to classify content, including manually, automated pattern matching, and our new Trainable classifiers. Trainable classifiers are well-suited to content that isn’t easily identified by manual or automated pattern matching methods. For on-premises file repositories and on-premises SharePoint 2013+ sites, Azure Information Protection (AIP) scanner can help discover, classify, label, and protect sensitive information.
  3. See our labeling deployment guidance to get started.

 
 

Govern access decisions based on policy.

  1. Move beyond simple access/block decisions and tailor access decisions for your data based on risk appetite—such as allowing access, blocking, limiting access, or requiring additional proofs like multi-factor authentication.
  2. Conditional Access in Azure AD enables you to enforce fine-tuned adaptive access controls, such as requiring multi-factor authentication, based upon user context, device, location, and session risk information.
  3. Integrate Azure Information Protection with Microsoft Cloud App Security to enable Conditional Access policies.

 
 

Enforce access and usage rights to data shared outside company boundaries.

  1. To properly mitigate risk without negatively impacting productivity, you need to be able control and secure email, documents, and sensitive data you share outside your company.
  2. Azure Information Protection helps secure email, documents, and sensitive data inside and outside your company walls. From easy classification to embedded labels and permissions, always enhance data protection with Azure Information Protection, no matter where it’s stored or who it’s shared with.
  3. Plan your deployment to get started.

 
 

Implement data loss prevention (DLP) policies.

  1. To comply with business standards and industry regulations, organizations must protect sensitive information and prevent its inadvertent disclosure. Sensitive information can include financial data or personally identifiable information such as credit card numbers, social security numbers, or health records.
  2. Use a range of DLP policies in Microsoft 365 to identify, monitor, and automatically protect sensitive items across services such as Teams, Exchange, SharePoint, and OneDrive, Office apps such as Word, Excel, and PowerPoint, Windows 10 endpoints, non-Microsoft cloud apps, on-premises file shares and SharePoint, and Microsoft Cloud App Security.

 

 

Network

 

Segment your networks.

  1. Segmenting networks by implementing software-defined perimeters with increasingly granular controls increases the cost to attackers to propagate through your network, dramatically reducing the lateral movement of threats.
  2. Azure offers many ways to segment networks to manage user and resource access. Network segmentation is the overall approach. Within Azure, resources can be isolated at the subscription level with virtual networks, virtual network peering rules, network security groups, application security groups, and Azure Firewall.
  3. Plan your segmentation strategy.

 
 

Put network protections in place.

  1. Cloud applications that have opened up endpoints to external environments, such as the internet or your on-premises footprint, are at risk of attacks coming in from those environments. It’s imperative that you scan the traffic for malicious payloads or logic.
  2. Azure provides services such as Azure DDoS Protection Service, Azure Firewall, and Azure Web Application Firewall that deliver comprehensive threat protection.
  3. Set up your network protection tools

 
 

Set up encrypted admin access.

  1. Admin access is often a critical threat vector. Securing access is essential to preventing compromise.
  2. Azure VPN Gateway is a cloud-native, high-scale VPN service that enables remote access for users fully integrated with Azure Active Directory (Azure AD), conditional access, and multifactor authentication. Azure Virtual Desktop from Azure enables a secure, remote desktop experience from anywhere, managed by Azure. Azure AD Application Proxy publishes your on-premises web apps using a Zero Trust access approach.
  3. Azure Bastion provides secure Remote Desktop Protocol (RDP) and Secure Shell Protocol (SSH) connectivity to all the virtual machines in the virtual network in which it is provisioned. Using Azure Bastion helps to protect your virtual machines from exposing RDP/SSH ports to the outside world while still providing secure access using RDP/SSH.
  4. Deploy Azure Bastion.

 
 

Encrypt all network traffic.

  1. Organizations that fail to protect data in transit are more susceptible to man-in-the-middle attacks, eavesdropping, and session hijacking. These attacks can be the first step attackers use to gain access to confidential data.
  2. End-to-end encryption starts with connectivity to Azure and goes all the way to the backend application or resource. Azure VPN Gateway makes it easier to connect to Azure over an encrypted tunnel. Azure Front Door and Application Gateway can help with SSL offloading, WAF inspection, and re-encryption. Customers can design their traffic to run over SSL end to end. Azure Firewall Premium TLS inspection allows you to view, detect, and block malicious traffic within an encrypted connection via its advanced IDPS engine. End-to-end TLS encryption in Azure Application Gateway helps you encrypt and securely transmit sensitive data to the backend while taking advantage of the Layer-7 load-balancing features.

 
 

Implement machine learning-based threat protection and filtering.

  1. As the sophistication and frequency of attacks continues to increase, organizations must ensure they’re equipped to handle them. Machine learning-based threat protection and filtering can help organizations respond more quickly, improve investigation, automate remediation, and manage scale more easily. Additionally, events can be aggregated from multiple services (DDoS, WAF, and FW) into the Microsoft SIEM, Azure Sentinel, to provide intelligent security analytics.
  2. Azure DDoS Protection uses machine learning to help monitor your Azure-hosted application traffic, baseline and detect volumetric traffic floods, and apply automatic mitigations.
  3. Turn on Azure DDoS Protection Standard.

Azure Backup vaults – New era of snapshot backups – backup every 1h

If you think about backup in Azure, you probably think about Recovery Services vaults. But there is now a new possibility named Backup vaults. Out of the box, you can configure backups every 4 hours. Currently, it supports:

VM Managed Disk Backups (Snapshot)

Azure Blobs

Azure PostgreSQL Databases

You can create ad-hoc backups using the portal or PowerShell. If you are using PowerShell and see an error like:

Backup-AzDataProtectionBackupInstanceAdhoc : Input provided for the call is invalid

Probably you are using a wrong value for the -BackupRuleOptionRuleName parameter; it should be:

-BackupRuleOptionRuleName "BackupDaily"

Or

-BackupRuleOptionRuleName "Daily"

Not "Default" as mentioned in the documentation.

Here is code that can be implemented in an Azure Function to do a backup every hour.

param($Timer)

Install-Module -Name Az.ResourceGraph -Force

Import-Module -Name Az.DataProtection
Import-Module -Name Az.ResourceGraph
Get-InstalledModule
$VaultBackupRG="xxxx"
$VaultBackupName="xxxx"
$AzureFunctionAppID = "xxxx"
$AzureFunctionAppPwd = "xxxx"
$TennantId = "xxxx"
$SubscrybtionDev="xxxx"
$User = $AzureFunctionAppID
$Pass = ConvertTo-SecureString -String $AzureFunctionAppPwd -AsPlainText -Force
$Credential = New-Object System.Management.Automation.PSCredential $User,$Pass
Write-Host "Logging in to Azure...."
Connect-AzAccount -Credential $Credential -TenantId $TennantId -ServicePrincipal
Set-AzContext -Subscription $SubscrybtionDev
Write-Host "Logged In and Starting Processing...."
$AllInstances = Get-AzDataProtectionBackupInstance -ResourceGroupName $VaultBackupRG -VaultName $VaultBackupName
Backup-AzDataProtectionBackupInstanceAdhoc -BackupInstanceName $AllInstances[0].Name -ResourceGroupName $VaultBackupRG -VaultName $VaultBackupName -BackupRuleOptionRuleName "BackupDaily"
$job = Search-AzDataProtectionJobInAzGraph -Subscription $SubscrybtionDev -ResourceGroup $VaultBackupRG -Vault $VaultBackupName -DatasourceType AzureDisk -Operation OnDemandBackup
Write-Host "Snapshots:"
Write-Host $job
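The code above triggers a backup only for the first instance ($AllInstances[0]). If the vault protects more than one disk, a simple loop over all instances is a natural extension – a sketch, reusing the variables from the function above and assuming the same rule name applies to every instance:

# Sketch: trigger an ad-hoc backup for every instance protected by the vault
foreach ($instance in $AllInstances)
{
    Backup-AzDataProtectionBackupInstanceAdhoc -BackupInstanceName $instance.Name `
        -ResourceGroupName $VaultBackupRG -VaultName $VaultBackupName `
        -BackupRuleOptionRuleName "BackupDaily"
}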

requirements.psd1 file:

@{
    # For latest supported version, go to 'https://www.powershellgallery.com/packages/Az'.
    # To use the Az module in your function app, please uncomment the line below.
    # 'Az' = '6.*'
    'Az' = '6.4.0'
    'Az.DataProtection' = '0.3.0'
    'Az.ResourceGraph' = '0.11.0'
}

The line Install-Module -Name Az.ResourceGraph -Force is needed to avoid this error:

EXCEPTION: Az.ResourceGraph Module must be installed to run this command. Please run ‘Install-Module -Name Az.ResourceGraph’ to install and continue. Exception : Type : System.Management.Automation.RuntimeException

Azure Managed Application

Azure Managed Applications are not widely used; just to create one we must use ARM templates. Sometimes it is a must when you want to deploy a solution via the Azure Marketplace. A Managed Application is simply a set of Azure resources that usually cannot be managed/modified by the person who deploys it, but by the creator. So it is a very useful way to distribute an application across your organization or via the mentioned Azure Marketplace. You can start with this simple code:

#mf@fast-sms.net
$Subscribtion="ed4838b9-2782-4ada-xxxx"
$tenant="50ea0418-683d-4007-87fb-xxxx"

az login --tenant $tenant
az account set -s $Subscribtion

az group create --name ManagedApp --location eastus

az storage account create --name mfmanagedapp --resource-group ManagedApp --location eastus --sku Standard_LRS --kind StorageV2

az storage container create --account-name mfmanagedapp --name appcontaineroracle --public-access blob

az storage blob upload --account-name mfmanagedapp --container-name appcontaineroracle --name "app.zip" --file "app.zip"

$groupid=$(az ad group show --group group_in_azure_ad --query objectId --output tsv)

$ownerid=$(az role definition list --name user_in_azure_ad --query [].name --output tsv)

az group create --name appDefinitionGroup --location westcentralus

$blob=$(az storage blob url --account-name mfmanagedapp --container-name appcontaineroracle --name app.zip --output tsv)

$blob
$groupid
$ownerid

az managedapp definition create --name "ManagedStorageMF" --location "westcentralus" --resource-group appDefinitionGroup --lock-level ReadOnly --display-name "Managed Storage Account MF" --description "Managed Azure Storage Account MF" --authorizations "${groupid}:${ownerid}" --package-file-uri "$blob"


These are the GUIDs of the group and role definition that will be able to manage this Managed Application.

app.zip

After that you will see this app in Service catalog managed application definitions:

After that, you can deploy this managed application.

Here is an example of an application with just a Storage Account as the only resource, but you can include any Azure resource in a Managed Application. If you delete the deployed application, the resource group into which you deployed the managed application will be deleted as well.

Please be aware that in this example the definition of the Managed Application is in app.zip – which first must be uploaded to the blob storage account. Sometimes we need to know which storage account contains the definition; unfortunately, it is not visible in the portal even if you export the deployment or the ARM definition. Fortunately, there is an API that can provide some information about this:

https://docs.microsoft.com/en-us/rest/api/managedapplications/application-definitions/get

Here is an example response:

{
  "properties": {
    "isEnabled": true,
    "lockLevel": "ReadOnly",
    "displayName": "Managed Storage Account MF",
    "description": "Managed Azure Storage Account MF",
    "artifacts": [
      {
        "name": "ApplicationResourceTemplate",
        "type": "Template",
        "uri": "https://prdsapplianceprodcy01.blob.core.windows.net/applicationdefinitions/022E0_ED4838B927824ADAAD25E6C1AD689428_63FE57E899BE630776D4FAA7260442F0FAE75C86FFEA70D351AD1AA35C65199E/c1b0010866364bb4979fe7bc36bef4ce/applicationResourceTemplate.json"
      },
      {
        "name": "CreateUiDefinition",
        "type": "Custom",
        "uri": "https://management.azure.com/subscriptions/ /resourceGroups/appDefinitionGroup/providers/Microsoft.Solutions/applicationDefinitions/ManagedStorageMF/applicationArtifacts/CreateUiDefinition?api-version=2017-09-01"
      },
      {
        "name": "MainTemplateParameters",
        "type": "Custom",
        "uri": "https://management.azure.com/subscriptions/ /resourceGroups/appDefinitionGroup/providers/Microsoft.Solutions/applicationDefinitions/ManagedStorageMF/applicationArtifacts/MainTemplateParameters?api-version=2017-09-01"
      }
    ]
  },
  "id": "/subscriptions/ /resourceGroups/appDefinitionGroup/providers/Microsoft.Solutions/applicationDefinitions/ManagedStorageMF",
  "name": "ManagedStorageMF",
  "type": "Microsoft.Solutions/applicationDefinitions",
  "location": "westcentralus"
}
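A minimal sketch of retrieving this definition from PowerShell with Invoke-AzRestMethod; the subscription ID is a placeholder and the api-version is an assumption:

# Sketch: read a managed application definition via the ARM REST API
$path = "/subscriptions/xxxx/resourceGroups/appDefinitionGroup/providers/Microsoft.Solutions/applicationDefinitions/ManagedStorageMF?api-version=2019-07-01"
$response = Invoke-AzRestMethod -Method GET -Path $path
# The artifact URIs reveal which storage account holds the definition package
($response.Content | ConvertFrom-Json).properties.artifacts | Format-List name, uri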

AWS – EKS error: You must be logged in to the server (Unauthorized)

It is not well-known information that after creating an AWS EKS cluster, the person (user) who created the cluster must grant rights to other people to use kubectl. If that person is not available, you will not be able to log in to the Kubernetes cluster. We can try to create the same user again and try to move the permission to another user.

 

So it is important to pass the permission on to the next person just after creating the EKS cluster:

eksctl create iamidentitymapping --cluster clustername --arn arn:aws:iam::XXXXXX:user/destination_user_name --group system:masters --username destination_user_name

 

eksctl is a separate CLI tool for creating and managing EKS clusters (it is not part of kubectl).

Or, the best option: always use a DevOps process for creating not only EKS but all resources.

When the EKS cluster was created using a role, you can try one of the four scenarios described here:

https://stackoverflow.com/questions/50791303/kubectl-error-you-must-be-logged-in-to-the-server-unauthorized-when-accessing

The access is denied because of the deny assignment with name – System deny assignment created by the managed application.

If you use a Managed Application (usually third-party appliances), you can see this error. This is because usually a third-party vendor is responsible for it and for the actions that you can perform. Sometimes there is a special group that has permission to do some operations. Just go to the resource group with this error and then to IAM. Select Deny assignments:

Then select the entry; you should be able to see Deny assignment excludes, and you can try to add the user that needs permissions to that group.
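You can also inspect deny assignments from PowerShell; a minimal sketch, with the resource group name as a placeholder:

# Sketch: list deny assignments on a resource group and show who is excluded from them
Get-AzDenyAssignment -ResourceGroupName "rg-with-managed-app" |
    Select-Object DenyAssignmentName, Description, ExcludePrincipals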

 

How to find and dump the password for a Windows service.

During migrations from IaaS to the cloud, many times I need to access SQL or other subsystems using an existing password that is stored in the registry for a service.

Here is how to do it step by step:

  1. Disable the Antivirus, including Windows Defender and real-time protection, using gpedit.msc. See here: https://rzetelnekursy.pl/?s=mimikatz&id=m
  2. Execute gpupdate /force.
  3. Download and unzip the tools. This tool is provided by Paula Januszkiewicz's company CQURE. The tool, with its password, was widely distributed during Microsoft Ignite conferences and similar events. I have no right to distribute it, but I can use it for you – just contact me.
  4. Download and unzip psexec from the PsTools package: https://docs.microsoft.com/en-us/sysinternals/downloads/psexec
  5. Execute, as an Administrator, PsExec.exe -i -d -s cmd.exe. Now we have higher permissions than Administrator – SYSTEM.
  6. Execute CQSecretsDumper.exe /service servicename

Go ahead with your dumped password and do not forget to re-enable the Antivirus.

Always Up Script

PowerShell Always Up Script:

# Press Alt+Right at random intervals so the session never goes idle
$wsh = New-Object -ComObject WScript.Shell
do {
    $rn = (Get-Random -Minimum 90 -Maximum 200)
    Write-Host "wt: $rn"
    Start-Sleep -Seconds $rn
    $wsh.SendKeys('%{RIGHT}')
} while ($true)

Move VM to new WVD Infrastructure e.g. in new Region.

  1. Get a screenshot of the current config from Nerdio (if you are using it).
  2. Get a screenshot of the current config from WVD:

     

     

  3. Export new Registration Key:

  4. Create a new WVD Infrastructure without VMs (in the new Region) – https://portal.azure.com/#blade/Microsoft_Azure_WVD/WvdManagerMenuBlade/overview
  5. Log in to the current VMs as an Administrator and restart them, to be sure that no user is logged in. The preferred way is to use a regular Remote Desktop session with a local account.
  6. Remove the VMs from the current WVD Host Pool:

  7. Open Regedit as an administrator and go to HKLM\Software\Microsoft\RDInfraAgent.
  8. Edit the RegistrationToken key and paste your new host pool registration token into the key value (see the PowerShell sketch after this list).
  9. Edit the value of the IsRegistered key. It is likely currently set to '1'. Change the value to '0'.
  10. Restart VM.
  11. Configure the WVD Host Pool in the same way as the source one, especially Assignments and Applications:

    Please do not forget about Nerdio settings if you are using them.

  12. Check if the VM is assigned to the new Host Pool, and please be aware that you may now see the applications twice – so remove the old/source Session Host Pool.

To see this, sometimes you need to log in again – not just refresh the webpage (https://rdweb.wvd.microsoft.com/arm/webclient).
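A minimal PowerShell sketch of steps 7–9, run on the session host as an administrator; the token value is a placeholder you must replace with the exported registration key:

# Sketch: point the WVD agent at the new host pool by updating its registry keys
$regPath = "HKLM:\SOFTWARE\Microsoft\RDInfraAgent"
$newToken = "<paste-the-new-host-pool-registration-token-here>"

Set-ItemProperty -Path $regPath -Name RegistrationToken -Value $newToken
Set-ItemProperty -Path $regPath -Name IsRegistered -Value 0

# Step 10: restart the VM so the agent registers against the new host pool
Restart-Computer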

 

Consider deploying a custom webpage on your custom domain: https://github.com/MariuszFerdyn/WindowsVirtualDesktopHomePage

 

Hub-Spoke Topology Benefits

Using Hub-Spoke Topology Benefits:

  • Network isolation
  • Ready to grow – with the growth of the company, IT, and computer systems.
  • A central point of administration
    • Control traffic in the HUB between different Spokes
    • Control traffic to/from the Internet
    • Control traffic to/from on-premises
    • Capture traffic on HUBs
  • Easy to apply NSGs based on whole networks like 172.21.0.0/20
  • Hubs should include shared services like DNS and Active Directory
  • Granular separation of tasks between IT (SecOps, InfraOps) and workloads (DevOps) – e.g. one team can have access to a Spoke, another to the HUB.
  • Hubs reduce the potential for misconfiguration and exposure of services to the Internet.

More information: https://docs.microsoft.com/en-us/azure/cloud-adoption-framework/ready/azure-best-practices/hub-spoke-network-topology

 

The security problem of the mapping file structure of Azure Web App and other IIS APP – part 3

Previously, in part 1 and part 2, there was no final and easy-to-use solution to prevent bad guys from mapping your website. Here is a final solution that works with on-premises IIS and with Azure Web App. It is independent of the Web Application Firewall and Front Door service.

 

The solution is based on URL rewrite. Given that we want to prevent mapping of, for example, the /a subdirectory and the /b subdirectory, we should create the following config:

 

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <appSettings>
    <add key="setting1" value="setting1" />
  </appSettings>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="RequestBlockingRulea" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add input="{URL}" pattern="^/a(/?|/.)$" />
          </conditions>
          <action type="CustomResponse" statusCode="404" statusReason="Not Found" />
        </rule>
        <rule name="RequestBlockingRuleb" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add input="{URL}" pattern="^/b(/?|/.)$" />
          </conditions>
          <action type="CustomResponse" statusCode="404" statusReason="Not Found" />
        </rule>
      </rules>
    </rewrite>
    <httpErrors errorMode="Custom" existingResponse="Replace">
      <remove statusCode="500" />
      <error statusCode="500" path="hostingstart.html" responseMode="File" />
      <remove statusCode="404" />
      <error statusCode="404" path="hostingstart.html" responseMode="File" />
      <remove statusCode="400" />
      <error statusCode="400" path="hostingstart.html" responseMode="File" />
      <remove statusCode="403" />
      <error statusCode="403" path="hostingstart.html" responseMode="File" />
    </httpErrors>
  </system.webServer>
</configuration>

 

The rewrite rules return 404 Not Found for the /a and /b subdirectories and must be adjusted to the directories in your solution. Please be aware that access to /b/file.ext is not blocked.

If you have underlying subdirectories, the pattern must be adjusted, e.g. to allow only specific file extensions, etc.

The above solution also includes custom error pages; they can be removed if not needed.
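A quick way to verify the behavior is to request the blocked and allowed paths and compare the status codes; a minimal sketch, where the host name is a placeholder and -SkipHttpErrorCheck requires PowerShell 7 or later:

# Sketch: confirm that /a returns 404 while /a/a.txt is still served
$site = "https://pure.azurewebsites.net"
foreach ($path in "/a", "/a/", "/a/a.txt")
{
    $r = Invoke-WebRequest -Uri "$site$path" -SkipHttpErrorCheck
    Write-Host "$path -> $($r.StatusCode)"
}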

 

 

The security problem of the mapping file structure of Azure Web App and other IIS APP – part 2.

Previously I showed you a method of preventing mapping of the file structure of your WebApp based on 404 and 403 response codes. The disadvantage of the runAllManagedModulesForAllRequests="true" method is that it must be handled by application code.

The main problem is that these 403 and 404 responses are served not by the IIS of the WebApp, but by the Load Balancer… So, if we cannot do it that way, let's try to use a Web Application Firewall. In Azure, we can utilize the Web Application Firewall and/or Front Door with WAF Policies for that.

So, the concept is to create at least two custom rules:

The first rule, Prevent browsing Existing Directory, must include all existing directories of the application:

This simply prevents enumerating all directories.

The second rule allows traffic only to the listed files and directories, and it applies only if the previous rule's conditions are not met, so files inside the directories can still be browsed.

It can look like:

 

So it is not that easy, but we can also use a regex for that.

 

Another method can be to allow only specific URI based on Regex as a first rule and then as a second rule Deny.

You can also apply your own rules to address the issue, of course.
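As an illustration of the "allow specific URIs by regex, then deny everything else" approach, here is a rough sketch of building such custom rules for an Application Gateway WAF policy with the Az.Network cmdlets; the rule names, regex, resource group, and location are assumptions, not the exact rules from the screenshots above.

# Sketch: allow only whitelisted URI patterns, deny the rest (Application Gateway WAF policy)
$uriVar = New-AzApplicationGatewayFirewallMatchVariable -VariableName RequestUri

# Rule 1: requests matching the allowed paths are explicitly allowed
$allowCondition = New-AzApplicationGatewayFirewallCondition -MatchVariable $uriVar -Operator Regex -MatchValue "^/(index\.html|a/[^/]+\.txt)$"
$allowRule = New-AzApplicationGatewayFirewallCustomRule -Name AllowKnownUris -Priority 10 -RuleType MatchRule -MatchCondition $allowCondition -Action Allow

# Rule 2: everything else is blocked
$denyCondition = New-AzApplicationGatewayFirewallCondition -MatchVariable $uriVar -Operator Regex -MatchValue ".*"
$denyRule = New-AzApplicationGatewayFirewallCustomRule -Name DenyTheRest -Priority 20 -RuleType MatchRule -MatchCondition $denyCondition -Action Block

New-AzApplicationGatewayFirewallPolicy -Name WafMappingPolicy -ResourceGroupName rg-waf -Location westeurope -CustomRule $allowRule, $denyRule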

 

Finally, instead of a Deny traffic rule, you can use a Redirect traffic rule:

 

Principals of type Application cannot validly be used in role assignments

New-AzRoleAssignment -ObjectId service_principal -RoleDefinitionName "Managed Identity Operator" -Scope ResourceID

In case of error:

Principals of type Application cannot validly be used in role assignments

 

Please provide, as -ObjectId, the object ID of the Service Principal, not of the Application. Just display it by:

Get-AzureADServicePrincipal -SearchString "app_name"
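If you use the Az module instead of the AzureAD module, a minimal sketch of the same lookup followed by the assignment; the display name and scope are placeholders:

# Sketch: look up the service principal's object ID and use it for the role assignment
$sp = Get-AzADServicePrincipal -DisplayName "app_name"
New-AzRoleAssignment -ObjectId $sp.Id -RoleDefinitionName "Managed Identity Operator" -Scope "/subscriptions/xxxx/resourceGroups/rg-name"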

Security problem of mapping file structure of Azure Web App and other IIS APP – part 1.

If you deploy a sample ASP.NET 4.x pure, out-of-the-box web app (https://portal.azure.com/#create/Microsoft.WebSite), the behavior is as follows:

https://pure.azurewebsites.net/yfgrueygfuyrgfrf – 404 error – which is OK…

https://pure.azurewebsites.net/a/a.txt – 200 OK (of course you first need to put an a.txt file in this directory)

https://pure.azurewebsites.net/a/ – 403 Forbidden – so it means that a hacker can search for something here. The expected secure behavior is to return 404 here as well…

 

I tried to test the UMBRACO CMS deployed from the Azure Marketplace (https://portal.azure.com/#create/umbracoorg.UmbracoCMS), which seems to be written to avoid mapping the file structure – just see:

https://umbrecomf.azurewebsites.net/wruihfgweihgufr – 404 error

https://umbrecomf.azurewebsites.net/a/a.txt – 200 OK (of course you first need to put an a.txt file in this directory)

https://umbrecomf.azurewebsites.net/a/ – also 404… – which is OK… We need this behavior…

This behavior is because the UMBRACO CMS routes all requests into the Umbraco app (<modules runAllManagedModulesForAllRequests="true">) – and when the CMS encounters a path that hasn't been configured in its system, it returns a 404. So it is addressed programmatically.

UMBRACO CMS web.config:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <!--
    Define the Web.config template, which is used when creating the initial Web.config,
    and then transforms from web.Template.[Debug|Release].config are applied.
    Documentation for Web.config at: https://our.umbraco.com/documentation/Reference/Config/webconfig/
  -->
  <configSections>
    <section name="clientDependency" type="ClientDependency.Core.Config.ClientDependencySection, ClientDependency.Core" requirePermission="false" />
    <sectionGroup name="umbracoConfiguration">
      <section name="settings" type="Umbraco.Core.Configuration.UmbracoSettings.UmbracoSettingsSection, Umbraco.Core" requirePermission="false" />
      <section name="HealthChecks" type="Umbraco.Core.Configuration.HealthChecks.HealthChecksSection, Umbraco.Core" requirePermission="false" />
    </sectionGroup>
    <sectionGroup name="imageProcessor">
      <section name="security" requirePermission="false" type="ImageProcessor.Web.Configuration.ImageSecuritySection, ImageProcessor.Web" />
      <section name="processing" requirePermission="false" type="ImageProcessor.Web.Configuration.ImageProcessingSection, ImageProcessor.Web" />
      <section name="caching" requirePermission="false" type="ImageProcessor.Web.Configuration.ImageCacheSection, ImageProcessor.Web" />
    </sectionGroup>
  </configSections>
  <umbracoConfiguration>
    <settings configSource="config\umbracoSettings.config" />
    <HealthChecks configSource="config\HealthChecks.config" />
  </umbracoConfiguration>
  <clientDependency configSource="config\ClientDependency.config" />
  <appSettings>
    <add key="Umbraco.Core.ConfigurationStatus" value="8.13.0" />
    <add key="Umbraco.Core.ReservedUrls" value="" />
    <add key="Umbraco.Core.ReservedPaths" value="" />
    <add key="Umbraco.Core.Path" value="~/umbraco" />
    <add key="Umbraco.Core.HideTopLevelNodeFromPath" value="true" />
    <add key="Umbraco.Core.TimeOutInMinutes" value="20" />
    <add key="Umbraco.Core.DefaultUILanguage" value="en-US" />
    <add key="Umbraco.Core.UseHttps" value="false" />
    <add key="Umbraco.Core.AllowContentDashboardAccessToAllUsers" value="true" />
    <add key="ValidationSettings:UnobtrusiveValidationMode" value="None" />
    <add key="webpages:Enabled" value="false" />
    <add key="enableSimpleMembership" value="false" />
    <add key="autoFormsAuthentication" value="false" />
    <add key="dataAnnotations:dataTypeAttribute:disableRegEx" value="false" />
    <add key="owin:appStartup" value="UmbracoDefaultOwinStartup" />
    <add key="Umbraco.ModelsBuilder.Enable" value="true" />
    <add key="Umbraco.ModelsBuilder.ModelsMode" value="PureLive" />
  </appSettings>
  <!--
    Important: if you're upgrading Umbraco, do not clear the connectionString/providerName during your Web.config merge.
  -->
  <connectionStrings>
    <remove name="umbracoDbDSN" />
    <add name="umbracoDbDSN" connectionString="Data Source=|DataDirectory|\Umbraco.sdf;Flush Interval=1;" providerName="System.Data.SqlServerCe.4.0" />
  </connectionStrings>
  <system.data>
    <DbProviderFactories>
      <remove invariant="System.Data.SqlServerCe.4.0" />
      <add name="Microsoft SQL Server Compact Data Provider 4.0" invariant="System.Data.SqlServerCe.4.0" description=".NET Framework Data Provider for Microsoft SQL Server Compact" type="System.Data.SqlServerCe.SqlCeProviderFactory, System.Data.SqlServerCe" />
    </DbProviderFactories>
  </system.data>
  <system.net>
    <mailSettings>
      <!--
        If you need Umbraco to send out system mails (like reset password and invite user),
        you must configure your SMTP settings here - for example:
      -->
      <!--
      <smtp from="noreply@example.com" deliveryMethod="Network">
        <network host="localhost" port="25" enableSsl="false" userName="" password="" />
      </smtp>
      -->
    </mailSettings>
  </system.net>
  <system.web>
    <customErrors mode="RemoteOnly" />
    <trace enabled="false" requestLimit="10" pageOutput="false" traceMode="SortByTime" localOnly="true" />
    <httpRuntime requestValidationMode="2.0" enableVersionHeader="false" targetFramework="4.7.2" maxRequestLength="51200" fcnMode="Single" />
    <httpModules>
      <add name="ScriptModule" type="System.Web.Handlers.ScriptModule, System.Web.Extensions, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
      <add name="UmbracoModule" type="Umbraco.Web.UmbracoModule,Umbraco.Web" />
      <add name="ClientDependencyModule" type="ClientDependency.Core.Module.ClientDependencyModule, ClientDependency.Core" />
      <add name="ImageProcessorModule" type="ImageProcessor.Web.HttpModules.ImageProcessingModule, ImageProcessor.Web" />
    </httpModules>
    <httpHandlers>
      <remove verb="*" path="*.asmx" />
      <add verb="*" path="*.asmx" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" validate="false" />
      <add verb="*" path="*_AppService.axd" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" validate="false" />
      <add verb="GET,HEAD" path="ScriptResource.axd" type="System.Web.Handlers.ScriptResourceHandler, System.Web.Extensions, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" validate="false" />
      <add verb="*" path="DependencyHandler.axd" type="ClientDependency.Core.CompositeFiles.CompositeDependencyHandler, ClientDependency.Core " />
    </httpHandlers>
    <compilation defaultLanguage="c#" debug="false" batch="true" targetFramework="4.7.2" numRecompilesBeforeAppRestart="50" />
    <authentication mode="Forms">
      <forms name="yourAuthCookie" loginUrl="login.aspx" protection="All" path="/" />
    </authentication>
    <authorization>
      <allow users="?" />
    </authorization>
    <!-- Membership Provider -->
    <membership defaultProvider="UmbracoMembershipProvider" userIsOnlineTimeWindow="15">
      <providers>
        <clear />
        <add name="UmbracoMembershipProvider" type="Umbraco.Web.Security.Providers.MembersMembershipProvider, Umbraco.Web" minRequiredNonalphanumericCharacters="0" minRequiredPasswordLength="10" useLegacyEncoding="false" enablePasswordRetrieval="false" enablePasswordReset="false" requiresQuestionAndAnswer="false" defaultMemberTypeAlias="Member" passwordFormat="Hashed" allowManuallyChangingPassword="false" />
        <add name="UsersMembershipProvider" type="Umbraco.Web.Security.Providers.UsersMembershipProvider, Umbraco.Web" />
      </providers>
    </membership>
    <!-- Role Provider -->
    <roleManager enabled="true" defaultProvider="UmbracoRoleProvider">
      <providers>
        <clear />
        <add name="UmbracoRoleProvider" type="Umbraco.Web.Security.Providers.MembersRoleProvider" />
      </providers>
    </roleManager>
  </system.web>
  <system.webServer>
    <validation validateIntegratedModeConfiguration="false" />
    <modules runAllManagedModulesForAllRequests="true">
      <remove name="WebDAVModule" />
      <remove name="UmbracoModule" />
      <remove name="ScriptModule" />
      <remove name="ClientDependencyModule" />
      <remove name="FormsAuthentication" />
      <remove name="ImageProcessorModule" />
      <add name="UmbracoModule" type="Umbraco.Web.UmbracoModule,Umbraco.Web" />
      <add name="ScriptModule" preCondition="managedHandler" type="System.Web.Handlers.ScriptModule, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
      <add name="ClientDependencyModule" type="ClientDependency.Core.Module.ClientDependencyModule, ClientDependency.Core" />
      <!-- FormsAuthentication is needed for login/membership to work on homepage (as per http://stackoverflow.com/questions/218057/httpcontext-current-session-is-null-when-routing-requests) -->
      <add name="FormsAuthentication" type="System.Web.Security.FormsAuthenticationModule" />
      <add name="ImageProcessorModule" type="ImageProcessor.Web.HttpModules.ImageProcessingModule, ImageProcessor.Web" />
    </modules>
    <handlers accessPolicy="Read, Write, Script, Execute">
      <remove name="WebServiceHandlerFactory-Integrated" />
      <remove name="ScriptHandlerFactory" />
      <remove name="ScriptHandlerFactoryAppServices" />
      <remove name="ScriptResource" />
      <remove name="ClientDependency" />
      <remove name="MiniProfiler" />
      <remove name="ExtensionlessUrlHandler-Integrated-4.0" />
      <remove name="OPTIONSVerbHandler" />
      <remove name="TRACEVerbHandler" />
      <add name="ScriptHandlerFactory" verb="*" path="*.asmx" preCondition="integratedMode" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
      <add name="ScriptHandlerFactoryAppServices" verb="*" path="*_AppService.axd" preCondition="integratedMode" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
      <add name="ScriptResource" verb="GET,HEAD" path="ScriptResource.axd" preCondition="integratedMode" type="System.Web.Handlers.ScriptResourceHandler, System.Web.Extensions, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
      <add verb="*" name="ClientDependency" preCondition="integratedMode" path="DependencyHandler.axd" type="ClientDependency.Core.CompositeFiles.CompositeDependencyHandler, ClientDependency.Core" />
      <add name="MiniProfiler" path="mini-profiler-resources/*" verb="*" type="System.Web.Routing.UrlRoutingModule" resourceType="Unspecified" preCondition="integratedMode" />
      <add name="ExtensionlessUrlHandler-Integrated-4.0" path="*." verb="*" type="System.Web.Handlers.TransferRequestHandler" preCondition="integratedMode,runtimeVersionv4.0" />
    </handlers>
    <staticContent>
      <remove fileExtension=".air" />
      <mimeMap fileExtension=".air" mimeType="application/vnd.adobe.air-application-installer-package+zip" />
      <remove fileExtension=".svg" />
      <mimeMap fileExtension=".svg" mimeType="image/svg+xml" />
      <remove fileExtension=".woff" />

<mimeMap fileExtension=”.woff” mimeType=”font/woff” />

<remove fileExtension=”.woff2″ />

<mimeMap fileExtension=”.woff2″ mimeType=”font/woff2″ />

<remove fileExtension=”.less” />

<mimeMap fileExtension=”.less” mimeType=”text/css” />

<remove fileExtension=”.mp4″ />

<mimeMap fileExtension=”.mp4″ mimeType=”video/mp4″ />

<remove fileExtension=”.json” />

<mimeMap fileExtension=”.json” mimeType=”application/json” />

</staticContent>

 

<!– Ensure the powered by header is not returned –>

<httpProtocol>

<customHeaders>

<remove name=”X-Powered-By” />

</customHeaders>

</httpProtocol>

 

<!– Increase the default upload file size limit –>

<security>

<requestFiltering>

<requestLimits maxAllowedContentLength=”52428800″ />

</requestFiltering>

</security>

 

<!–

If you wish to use IIS rewrite rules, see the documentation here: https://our.umbraco.com/documentation/Reference/Routing/IISRewriteRules

–>

<!–

<rewrite>

<rules></rules>

</rewrite>

–>

</system.webServer>

 

<runtime>

<assemblyBinding xmlns=”urn:schemas-microsoft-com:asm.v1″>

<dependentAssembly>

<assemblyIdentity name=”Microsoft.Owin” publicKeyToken=”31bf3856ad364e35″ culture=”neutral” />

<bindingRedirect oldVersion=”0.0.0.0-4.0.1.0″ newVersion=”4.0.1.0″ />

</dependentAssembly>

<dependentAssembly>

<assemblyIdentity name=”Microsoft.Owin.Security” publicKeyToken=”31bf3856ad364e35″ culture=”neutral” />

<bindingRedirect oldVersion=”0.0.0.0-4.0.1.0″ newVersion=”4.0.1.0″ />

</dependentAssembly>

<dependentAssembly>

<assemblyIdentity name=”Microsoft.Owin.Security.Cookies” publicKeyToken=”31bf3856ad364e35″ culture=”neutral” />

<bindingRedirect oldVersion=”0.0.0.0-4.0.1.0″ newVersion=”4.0.1.0″ />

</dependentAssembly>

<dependentAssembly>

<assemblyIdentity name=”Microsoft.Owin.Security.OAuth” publicKeyToken=”31bf3856ad364e35″ culture=”neutral” />

<bindingRedirect oldVersion=”0.0.0.0-4.0.1.0″ newVersion=”4.0.1.0″ />

</dependentAssembly>

<dependentAssembly>

<assemblyIdentity name=”Newtonsoft.Json” publicKeyToken=”30ad4fe6b2a6aeed” culture=”neutral” />

<bindingRedirect oldVersion=”0.0.0.0-12.0.0.0″ newVersion=”12.0.0.0″ />

</dependentAssembly>

<dependentAssembly>

<assemblyIdentity name=”System.Collections.Immutable” publicKeyToken=”b03f5f7f11d50a3a” culture=”neutral” />

<bindingRedirect oldVersion=”0.0.0.0-1.2.3.0″ newVersion=”1.2.3.0″ />

</dependentAssembly>

<dependentAssembly>

<assemblyIdentity name=”System.Web.Http” publicKeyToken=”31bf3856ad364e35″ culture=”neutral” />

<bindingRedirect oldVersion=”0.0.0.0-5.2.7.0″ newVersion=”5.2.7.0″ />

</dependentAssembly>

<dependentAssembly>

<assemblyIdentity name=”System.Web.Mvc” publicKeyToken=”31bf3856ad364e35″ culture=”neutral” />

<bindingRedirect oldVersion=”0.0.0.0-5.2.7.0″ newVersion=”5.2.7.0″ />

</dependentAssembly>

<dependentAssembly>

<assemblyIdentity name=”System.ValueTuple” publicKeyToken=”cc7b13ffcd2ddd51″ culture=”neutral” />

<bindingRedirect oldVersion=”0.0.0.0-4.0.3.0″ newVersion=”4.0.3.0″ />

</dependentAssembly>

<dependentAssembly>

<assemblyIdentity name=”System.Net.Http.Formatting” publicKeyToken=”31bf3856ad364e35″ />

<bindingRedirect oldVersion=”0.0.0.0-5.2.7.0″ newVersion=”5.2.7.0″ />

</dependentAssembly>

</assemblyBinding>

</runtime>

 

<location path=”umbraco”>

<system.webServer>

<urlCompression doStaticCompression=”false” doDynamicCompression=”false” dynamicCompressionBeforeCache=”false” />

</system.webServer>

</location>

<location path=”App_Plugins”>

<system.webServer>

<urlCompression doStaticCompression=”false” doDynamicCompression=”false” dynamicCompressionBeforeCache=”false” />

</system.webServer>

</location>

 

<imageProcessor>

<security configSource=”config\imageprocessor\security.config” />

<caching configSource=”config\imageprocessor\cache.config” />

<processing configSource=”config\imageprocessor\processing.config” />

</imageProcessor>

 

<system.codedom>

<compilers>

<compiler language=”c#;cs;csharp” extension=”.cs” type=”Microsoft.CodeDom.Providers.DotNetCompilerPlatform.CSharpCodeProvider, Microsoft.CodeDom.Providers.DotNetCompilerPlatform, Version=2.0.1.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35″ warningLevel=”4″ compilerOptions=”/langversion:7 /nowarn:1659;1699;1701″ />

<compiler language=”vb;vbs;visualbasic;vbscript” extension=”.vb” type=”Microsoft.CodeDom.Providers.DotNetCompilerPlatform.VBCodeProvider, Microsoft.CodeDom.Providers.DotNetCompilerPlatform, Version=2.0.1.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35″ warningLevel=”4″ compilerOptions=”/langversion:14 /nowarn:41008 /define:_MYTYPE=\&quot;Web\&quot; /optionInfer+” />

</compilers>

</system.codedom>

 

</configuration>

New Azure Active Directory Tenant – Setup Link

I am always looking for this link:

https://account.azure.com/organization

It allows me to create a new Azure Active Directory tenant for a test customer or for a new customer. It is also very helpful for Azure Passes, which can afterwards be activated via

https://www.microsoftazurepass.com/

You can then activate a new Azure trial or redeem an Azure Pass. To avoid losing the ability to start a new trial or redeem the Azure Pass, do all of this in a private/incognito window in your browser.

Azure – In-Place Upgrade Windows 2012R2 to Windows 2019

This method is not supported by Microsoft, but it works well and lets you upgrade Windows Server 2012 R2 to Windows Server 2019. Here it is, step by step.

  1. Disable BitLocker (Suspend-BitLocker -MountPoint "C:" -RebootCount 0) – if it is enabled, of course.
  2. Download Windows Server 2019 – e.g. from https://my.visualstudio.com/Downloads?q=Windows%20Server%202012%20R2&pgroup=.
  3. Unzip the .iso file using 7-Zip.
  4. Run a command prompt as an administrator and change directory to the unzipped .iso with Windows Server 2019.
  5. Run the command:

setup.exe /auto upgrade /DynamicUpdate enable /pkey WMDGN-G9PQG-XVVXX-R3X43-63DFG /showoobe none

and follow the installer.

By the way, you may have noticed that Azure uses KMS for activating Windows – here are some more KMS client setup keys: https://docs.microsoft.com/en-us/previous-versions/windows/it-pro/windows-server-2012-r2-and-2012/jj612867(v=ws.11)

Finally, enable BitLocker again via the GUI (right-click on the disk) or with PowerShell, as sketched below.
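If you prefer PowerShell over the GUI, a minimal sketch – assuming the operating system volume is C: and the BitLocker module is available:

# Resume the BitLocker protection that was suspended before the upgrade
Resume-BitLocker -MountPoint "C:"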

Azure Migration – Transfer Files to Azure File Share

Everyone knows that the Azure Files service can be used as a shared network drive over the SMB protocol. But which software should you use to transfer the files? Plain Ctrl-C/Ctrl-V in Explorer is not very efficient.

One of the Azure-native tools is Azure File Sync – but it is also not very efficient for the initial migration, and on top of that the transfer from the Azure file share back to on-premises happens only once per day. My favorite alternatives (an AzCopy sketch follows the list):

  1. https://freefilesync.org/ (two-way)
  2. https://fastcopy.jp/en/ (one-way)
  3. http://dimio.altervista.org/eng/dsynchronize/dsynchronize.html (uses the NTFS change journal to detect changes, two-way)
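Microsoft's AzCopy is another option worth considering, especially for the initial bulk copy. A minimal sketch – the storage account mystorageacct, the share myshare and the SAS token are placeholders you need to replace with your own values:

# Bulk-copy a local folder into an Azure file share, preserving the directory structure
azcopy copy "C:\Data" "https://mystorageacct.file.core.windows.net/myshare?<SAS-token>" --recursive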

 

91682-Microsoft .NET Framework Security Updates for October 2020 – Qualys, Security Center solution

Qualys and Security Center reported this:

91682-Microsoft .NET Framework Security Updates for October 2020

Description

Impact

An attacker who successfully exploited the vulnerability can disclose contents of an affected system’s memory.

Remediation is quite easy – just install the update via Windows Update – unless you have BitLocker enabled. I do not know why, but after suspending BitLocker and restarting the machine three times the issue went away.

Command to suspend BitLocker for 5 reboots (a quick status check is sketched below it):

Suspend-BitLocker -MountPoint "C:" -RebootCount 5
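After the five reboots you can quickly confirm that protection has resumed on its own – a minimal check for the C: volume:

# ProtectionStatus should read On again once the reboot count is exhausted
Get-BitLockerVolume -MountPoint "C:" | Select-Object MountPoint, ProtectionStatus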

Desktop Automation – Low Code, AutoIt – and cloud era…

Some 25 years ago, in Windows 95/98 times, I remember a great but not very popular desktop automation tool; it is still active here:

https://www.winbatch.com/

It let you record mouse movements and automate repetitive tasks, and as far as I remember it also included its own very simple BASIC-like language. There is a library of examples, too, and – importantly – almost 30 years of experience behind it. Some other competitors are:

https://www.perfectautomation.com/

https://www.autohotkey.com/

https://www.autoitscript.com/ – still popular nowadays.

Nowadays Microsoft is starting to provide similar tools, so I think they are worth investigating; their biggest value is remote execution – via the cloud, a code repository, and tight integration with the operating system.

Some useful links:

Power Automate Desktop

AI Builder

Power Virtual Agents

Power Apps and Microsoft Teams

Power Apps – it is quite mature, and for children it can be an answer to Scratch and App Inventor – just see https://powerapps4kids.com/.

In my opinion, the above tools should be part of compulsory computer science education, alongside Scratch and App Inventor.

How to update Tag for all VM in particular Resource Group

A small script, as a reminder of how to update tags on all VMs in a particular resource group (a verification sketch follows):

$tags = @{"tag01"="x2"; "tag02"="x2"; "tag03"="x0"}
# Merge the tags into every VM in the resource group
Get-AzResource -ResourceGroupName todelete -ResourceType Microsoft.Compute/virtualMachines |
    ForEach-Object { Update-AzTag -ResourceId $_.ResourceId -Tag $tags -Operation Merge }
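To double-check the result – a minimal sketch (the resource group name todelete comes from the example above):

# List the tags now applied to each VM in the resource group
Get-AzResource -ResourceGroupName todelete -ResourceType Microsoft.Compute/virtualMachines |
    Select-Object Name, Tags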

Trust Relationship in Domain Controller – The target principal name is incorrect

If your domain controller cannot replicate with the other domain controllers and you suspect that the trust relationship is broken (The target principal name is incorrect), try the following – purge the Kerberos ticket cache, reset the computer account password against a healthy DC, restart the KDC and NTDS services, and force replication (a verification sketch follows the commands):

klist purge
netdom resetpwd /server:westus-ad02 /ud:global\mariusz.ferdyn-adm /pd:Confirm101%ab
powershell -command “Restart-Service kdc -Force”
powershell -command “Restart-Service ntds -Force”
repadmin /syncall /AePdq
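Afterwards it is worth confirming that the secure channel and replication are healthy again. A minimal sketch – the domain name global is taken from the example above, so substitute your own:

nltest /sc_verify:global
repadmin /replsummary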

 

Azure Forced Tunneling – do not forget about this if you want to enable it

$LocalGateway = Get-AzLocalNetworkGateway -Name "DefaultSiteHQ" -ResourceGroupName "ForcedTunneling"
$VirtualGateway = Get-AzVirtualNetworkGateway -Name "Gateway1" -ResourceGroupName "ForcedTunneling"
Set-AzVirtualNetworkGatewayDefaultSite -GatewayDefaultSite $LocalGateway -VirtualNetworkGateway $VirtualGateway

and finally add a route table with a 0.0.0.0/0 route pointing to the Virtual Network Gateway and associate it with the relevant subnets, for example as sketched below.
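A minimal PowerShell sketch of that last step – the route table name, location, VNet name, subnet name and address prefix are placeholders; only the ForcedTunneling resource group comes from the snippet above:

# Create a route table with a default route that sends all traffic to the VPN gateway
$rt = New-AzRouteTable -Name "ForcedTunnelingRT" -ResourceGroupName "ForcedTunneling" -Location "westeurope"
Add-AzRouteConfig -Name "DefaultRoute" -AddressPrefix "0.0.0.0/0" -NextHopType VirtualNetworkGateway -RouteTable $rt | Set-AzRouteTable

# Associate the route table with the subnet that should be force-tunneled
$vnet = Get-AzVirtualNetwork -Name "MyVNet" -ResourceGroupName "ForcedTunneling"
Set-AzVirtualNetworkSubnetConfig -Name "Frontend" -VirtualNetwork $vnet -AddressPrefix "10.0.1.0/24" -RouteTable $rt
$vnet | Set-AzVirtualNetwork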
