Use Windows 10’s per-monitor display scaling to improve your multi-monitor setup

One big plus with Windows 10 is that it has lots of small but essential tweaks for power users. In this tutorial, we’re looking at a great option for anyone running a multi-monitor setup, especially for folks rocking 4K monitors.

Windows 10, like Windows 8.1, includes a way to adjust the DPI (dots per inch) scaling on a per-monitor basis using a percentage scale. This handy tool gives you more granular control when you’re using monitors of varying resolutions, or monitors with the same resolution but different screen sizes. It’s far better than applying a single DPI scaling percentage to all of your monitors (which could lead to nasty sizing wonkiness), as older versions of Windows did.
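To make those percentages concrete: Windows treats 100 percent as 96 DPI, so each scaling preset maps to an effective dots-per-inch value. Here’s a quick back-of-the-envelope sketch (the helper name is mine, not a Windows tool):

```shell
# Windows' baseline is 96 DPI at 100%, so: effective dpi = 96 * percent / 100.
scale_to_dpi() { echo $(( 96 * $1 / 100 )); }

scale_to_dpi 100   # 96  (no scaling)
scale_to_dpi 125   # 120
scale_to_dpi 150   # 144
scale_to_dpi 175   # 168
```

This is why mixed setups benefit from per-monitor control: a 4K panel and a 1080p panel need different percentages to land at a comfortable effective DPI.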

Per-monitor action

To get started, right-click any empty space on your desktop and select Display settings at the bottom of the context menu. Alternatively, you can go to Start > Settings > System > Display.

Once you’re there, you’ve won half the battle. You should see a graphical layout of your monitor setup. Here, I have a laptop display labeled 1 and an external monitor labeled 2.

Further down the screen, under Scale and layout, you’ll see a drop-down menu labeled Change the size of text, apps, and other items: X% (Recommended). It used to be a slider in earlier versions of Windows 10, but because this feature defaults to specific scaling presets, a drop-down made more sense. Before you start changing the scaling, make sure the correct monitor is highlighted.

In this case, the laptop monitor is highlighted in blue, but I don’t want to modify the scaling for this display; the 1920-by-1080 view automatically scaled to 125% is fine. The larger 1080p monitor, however, will be easier on my tired eyes with a little scaling applied.

All I do is click on monitor 2 in the Settings screen, and then choose my scaling as seen above. The preset options for my 1080p monitor were 100, 125, 150, and 175 percent. Other displays may come with different scaling options. I have a laptop with a 1366×768 display that maxes out at 125 percent, for example. Your mileage will vary depending on the resolution of your monitors.

Once we’ve applied different scaling settings, we have to do a little clean-up. Windows 10 may tell you to sign out and back in again before some apps on your PC respond to the new scaling settings.

When we first looked at this feature back in 2015, you definitely wanted to follow Microsoft’s advice right away. Now that we’re deeper into the age of 4K, however, it’s not as necessary, as many developers are packaging high-DPI assets in their programs.

Still, if you have a lot of older programs that don’t get updated much, logging out and in again is advised.

Once you’re back in, there’s one other scaling setting we can use. Open the Settings app again to System > Display. Under the scaling drop-down, click the Advanced scaling settings link.

That will take you to a second page. Here, turn on Let Windows try to fix apps so they’re not blurry by clicking the slider. This will let Windows try to scale up any apps that aren’t making the cut.

A tip for power users

Power users with multiple monitors of the same resolution may want more refined scaling than the presets can offer. They can return to the advanced settings screen and use Custom scaling (pictured above). Here you can enter a percentage between 100 and 500 to apply a blanket scaling to all your monitors.
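To see what a blanket scale factor does to usable workspace, divide the physical resolution by the scale. A small sketch (the resolutions here are illustrative examples, not from this article):

```shell
# Effective desktop workspace at a given scale: physical pixels * 100 / percent.
# Args: width height percent
effective() {
  echo "$(( $1 * 100 / $3 ))x$(( $2 * 100 / $3 ))"
}

effective 3840 2160 150   # a 4K panel at 150% lays out like 2560x1440
effective 1920 1080 125   # a 1080p panel at 125% lays out like 1536x864
```

The higher the percentage, the less room your windows have, which is why oversized blanket values quickly feel cramped.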

SharePoint Server 2019 Preview Now Available

Microsoft on Tuesday announced a public preview release of SharePoint Server 2019.

In addition, there’s a public preview of Project Server 2019, which is part of SharePoint Server 2019 but gets licensed separately. The SharePoint Server 2019 preview is modeled on the current SharePoint Server 2016 product, but it adopts some SharePoint Online product features as well.

The preview is available here. Microsoft plans to release the SharePoint Server 2019 product sometime this fall, according to slide 30 of a Microsoft slide deck from the May SharePoint Conference North America event.

Modern Experiences
For end users, Microsoft has ported some of its SharePoint Online “modern” experiences over to SharePoint Server 2019. End users on the modern experience get a different interface than the older, so-called “classic” experience. Modern interfaces that show up in the SharePoint Server 2019 preview include:

Communication Sites
Team Sites
Lists
Libraries
Web Parts
Home Page
OneDrive next-generation sync client

Missing from this list are Hub Sites, which are used in SharePoint Online to organize Communication Sites and Team Sites. Hub Sites apparently won’t be available in the SharePoint Server 2019 product, though the feature is available in the Office 365 SharePoint Online product version.

Microsoft has published a comparison between the SharePoint Server 2019 preview, with its modern user interface additions, and the current SharePoint Server 2016 product, with its classic interface, in this support document. The document notes that there are differences between the two interfaces, and that features don’t map on a one-to-one basis. It also appears that SharePoint Server 2019 users will still see some classic interfaces.

“Existing team site home pages together with site pages like wiki and web part pages are classic experiences,” the support document clarified.

The advantage of using the OneDrive next-generation sync client with SharePoint Server 2019 is that there’s no longer a dependency on the older Groove technology. The Groove client is deprecated in SharePoint Server 2019. The OneDrive next-generation sync client will work in “hybrid” (cloud plus on-premises) environments, according to a SharePoint “Intrazone” discussion between Mark Kashman and Bill Baer, who are both senior product managers for SharePoint at Microsoft.

Microsoft is touting the use of the OneDrive next-generation sync client because it supports “advanced features which can include Files On-Demand, push notification and IRM [information rights management] protection,” according to Microsoft’s “What’s New” document.

Microsoft described other new features in SharePoint Server 2019. One is integration with PowerApps and Microsoft Flow, which can add automation to apps. Also, users will be able to recover content deleted by other users using the Recycle Bin. Microsoft also is bringing the Office 365 App Launcher to SharePoint Server 2019, which gets enabled via “hybrid team sites and/or OneDrive for Business,” according to Microsoft’s “Reviewer’s Guide” (PDF document) description. Microsoft also is touting “fast site creation” by end users in SharePoint Server 2019, which takes “five to ten seconds,” per the Intrazone talk.

Baer also notably answered, “No,” when asked by Kashman during the Intrazone talk whether SharePoint Server used on premises was dead.

IT Pro Features
IT pros will find some perks in SharePoint Server 2019.

Microsoft expanded some of the technical limits compared with SharePoint Server 2016. For instance, SharePoint Server 2019 now supports file uploads up to 15GB, which is up from 10GB in SharePoint Server 2016, per the “What’s New” document. It’s also now possible to use the # and % characters in file and folder names. Microsoft also increased URL path lengths to 400 Unicode code units (up from the prior 260 code units limit).
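Expressed as simple checks, the new limits look like this (the numbers come from the “What’s New” document; the helper functions and sample path are my own, and shell string length only approximates Unicode code units for non-ASCII names):

```shell
MAX_UPLOAD_GB=15    # up from 10 GB in SharePoint Server 2016
MAX_URL_UNITS=400   # up from the prior 260-code-unit limit

upload_ok() { [ "$1" -le "$MAX_UPLOAD_GB" ] && echo yes || echo no; }
url_ok()    { [ "${#1}" -le "$MAX_URL_UNITS" ] && echo yes || echo no; }

upload_ok 12                                              # yes
upload_ok 20                                              # no: over the 15 GB cap
url_ok "/sites/team/Shared Documents/Q3 report 100%.xlsx" # yes: # and % now allowed
```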

The server product can now authenticate to Simple Mail Transfer Protocol (SMTP) servers when sending e-mail messages. It’s an infrastructure improvement that’s described in the “What’s New” document.

“This [SMTP authentication] makes it much easier for customers to integrate SharePoint into highly secure environments where authentication is required to send emails,” the document explained. “Customers will no longer need to configure smart host relays for SharePoint in these environments.”

In addition to SMTP authentication support, SharePoint Server 2019 will get a new Health Rule that can be used to check that the “outgoing email credentials are the same between all servers,” according to a Microsoft TechNet blog post. That sort of setup needs to store and retrieve the SMTP password that’s used when sending e-mails.

The Hybrid Configuration Wizard is more accessible now, and can be launched from the admin portal. Creating hybrid sites “is no longer a lengthy infrastructure task,” according to Microsoft’s Intrazone talk.

Kashman and Baer also talked a bit about using the SharePoint Migration Tool vs. solutions offered by other software vendors to shift SharePoint workloads to Office 365. The SharePoint Migration Tool is merely a self-service tool for migration scenarios and doesn’t offer the breadth of support provided by third-party tools, they said.

Microsoft Premier Field Engineers talked a little about some of the automation capabilities available with SharePoint Server 2019. The upshot of the talk was that AutoSPInstaller, a PowerShell tool used to automate SharePoint deployments, “is not going anywhere” with the new SharePoint Server 2019 product. However, SharePointDSC was described as a more declarative tool to use. SharePointDSC, a PowerShell desired state configuration tool that was formerly known as “xSharePoint,” lets organizations define the configuration of SharePoint farms; it can also be used to check the status of a SharePoint farm after installation, they noted. It has a pull mode for keeping the servers in the desired state, per this Microsoft description.

Backup and Restore operations with SQL Server 2017 on Docker containers using SQL Operations Studio

In this 18th article of the series, we’ll discuss the concepts of database backup-and-restore of SQL Server Docker containers using SQL Ops Studio (SOS). Before proceeding, you must have the Docker engine installed and SQL Ops Studio configured on your host machine.

This article covers the following topics:

Overview of SQL Operations Studio (SOS)
How to use the SQL Ops Studio integrated terminal
Definition of Docker containers
Step-by-step instructions to initiate backup-and-restore of SQL Server 2017 Docker containers using the SQL Ops Studio interface
And more…

SQL Ops Studio

Microsoft released a new light-weight, cross-platform GUI tool under the SQL Server management umbrella called SQL Operations Studio. SQL Operations Studio is a cross-platform graphical user interface for working with SQL Server instances.

Feature Highlights

It delivers cross-platform support to manage SQL Server databases on Windows, macOS, and Linux, or in Docker containers on any platform
SQL Server Connection Management that supports
Connection Dialog
Server Groups creation
Azure Integration
Create Registered Servers list
It may also be used to connect to Microsoft’s cloud databases, including Azure SQL Database and Azure SQL Data Warehouse.
In-built Object Explorer support
Advanced T-SQL query editor support
New Query Results Viewer with data grid support
The result set can be exported to JSON\CSV\Excel
Derive custom charts
Manage Dashboard using standard and customizable widgets and insights
Backup and Restore database dialog
Task History window to check out the task execution status
In-house database scripting options
In-built Git integration
In-built shell support using an integrated terminal
And more…

The list goes on…

I would recommend you download the version for your platform and see how it works.
Docker containers

Docker carves up a running system into small containers, each of which is sealed and segmented, has its own programs, and is isolated from everything else. Docker’s mission is to build, ship, and run distributed applications anywhere, on any platform: on your local laptop, in the cloud, or on on-premises servers.

Containers are highly portable; each program is kept as small as possible and has no external dependencies
It is easy to make a Docker image and then move or copy it to another host and be sure that it will still work in the same way.
To run SQL Server in a Docker container, you need:
Docker Engine 1.8 or above
A minimum of two gigabytes of disk space to store the container image, and at least two gigabytes of RAM

Getting started

Let’s start SQL Operations Studio and open the interactive terminal.

First, let’s download the latest SQL Server 2017 image from the Docker hub. To pull it, run the docker pull command with the SQL Server 2017 image tag. It’s also possible to pull specific Docker container images from the Docker hub repository. To get the latest SQL image, enter the word “latest”, the tag, after the colon. This pulls the SQL Server 2017 image for us.
[root@localhost thanvitha]# docker pull microsoft/mssql-server-linux:latest

Now, run the Docker image using the docker run command.
[root@localhost thanvitha]# docker run -e 'ACCEPT_EULA=Y' -e 'MSSQL_SA_PASSWORD=thanVitha@20151' -p 1401:1433 --name SQLOpsBackupDemo -d microsoft/mssql-server-linux:latest

The SQL instance is now able to accept connections. Next, to connect to the SQL instance, choose the Server icon in the left corner of the window.

Add the connection details:

Enter the IP address (10.2.6.50) of the host machine and the incoming port number, for example, 1401.
Enter the SA login credentials
Click Connect

Next, follow the steps below for backing up the databases.

To perform the backup task, right-click the database and pick the database manage window.

For instance, right-click the SQLShackDemo database and choose Manage. On the database dashboard pane, we get some useful information, including the current recovery model, the last time backups were performed on the database and the log, and the database’s owner account.

Now, let’s go ahead and click the Backup icon. A new window pops up where you can specify a backup name. SQL Operations Studio suggests a name for the backup that references the current date and time.
Let’s go ahead and choose the type of backup; in this case, it’s a full backup.
For the backup file location, in this instance, it displays a full path that is relative to the Docker container. We can also tweak the settings using the advanced configuration options.

Press the Backup button to initiate the backup task.

Now, as you can see, the sidebar has changed to the task-history view on the left. You can check the status of the backup job here.

When you’re done looking at that, you can switch back over to the server sidebar. Connect to the SQL container and open the interactive bash terminal using the docker command to verify the backup file that was created through the SQL Ops Studio backup dialog.

[root@localhost thanvitha]# docker exec -it SQLOpsBackupDemo bash

root@cc8f1beae1e1:/# ls -l /var/opt/mssql/data/*.bak
-rw-r-----. 1 root root 434176 May 25 14:26 /var/opt/mssql/data/SQLShackDemo-2018525-10-24-39.bak

Now, let’s dig into the second part of the process.

To perform the database restore, I am instantiating a new SQL instance, SQLOpRestoreDemo, with the following docker run command.
[root@localhost thanvitha]# docker run -e 'ACCEPT_EULA=Y' -e 'MSSQL_SA_PASSWORD=thanVitha@2015' -p 1402:1433 --name SQLOpRestoreDemo -d microsoft/mssql-server-linux:latest

Let’s copy the backup file to the host machine by navigating to the backup file directory. Then, copy the backup file from the host machine to the other SQL Docker container using the following docker cp commands.
[root@localhost thanvitha]# docker cp SQLOpsBackupDemo:/var/opt/mssql/data/SQLShackDemo-2018525-10-24-39.bak /tmp

[root@localhost thanvitha]# docker cp /tmp/SQLShackDemo-2018525-10-24-39.bak SQLOpRestoreDemo:/var/opt/mssql/data/

Now, connect to the SQL instance by entering the required details. Here, the IP address and port number are entered to reach the instance.

Next, click the Restore icon on the dashboard.

On the restore database screen, select the General section; then find the backup file by navigating to the backup directory.

On the Files tab, specify where to relocate the data and log files.

On the Options tab, choose the overwrite options.

You can also generate a script and run it, or press the Restore button to finish the restore process.

The task history appears on the right in SQL Ops Studio. This confirms that the database SQLShackDemo was restored successfully.

You can also browse the SQL instance to verify the database.

That’s all for now…
Wrapping Up

So far, you have seen step-by-step instructions to initiate a database backup and restore of SQL Docker containers using the SQL Ops Studio interface.

You could fairly call it a light-weight version of SQL Server Management Studio (SSMS). The interface is simple, straightforward and self-explanatory. It’s built with several options and is properly laid out to walk you through common procedures.

Microsoft Sharpens SQL Operations Studio’s Job Management Tools

Microsoft has released a new version of SQL Operations Studio, borrowing key features from SQL Server Management Studio (SSMS), including the ability to track the SQL Server Agents used to execute scheduled administrative tasks, or “jobs” as they are commonly called.

SQL Operations Studio is a free, cross-platform (Windows, macOS and Linux) database management tool that works with Microsoft SQL Server, Azure SQL Database and Azure SQL Data Warehouse. Now, in addition to listing active SQL Server Agent jobs, the tool lets you view alerts, operators and proxies, announced Alan Yu, a program manager at Microsoft SQL Server, in a July 19 blog post.

The main Jobs view now carries a visualization that works as a glanceable reference on previously executed jobs and whether they passed or failed, Yu noted. Zeroing in on a specific job among a long list has become easier thanks to new filtering capabilities, he added. Further diminishing the need to switch to SQL Server Management Studio, the tool now includes new dialog boxes that let users create new jobs, alerts and operators.

Following up on the June 2018 beta release of the SQL Server Profiler extension for SQL Operations Studio, Microsoft’s developers have added new keyboard shortcuts and options to help users set up their monitoring environments faster. As its name implies, SQL Server Profiler allows users to trace server activity for monitoring and troubleshooting purposes.

Today, users can use the shortcuts to quickly launch, stop and start the profiler. The extension also now includes templates for five default views which offer key insights into a database server’s operation. “When clicking on each view, a different list of columns will populate in the Profiler view so you can focus on the areas you’re investigating,” Yu explained.

Also new for the budding SQL Operations Studio ecosystem is a script-management extension from Cobus Kruger.

The new Combine Scripts extension enables users to combine scripts that are located across multiple folders. After the user selects the desired script files, the tool produces a single file which can be run or stored for safekeeping.

Developers looking to contribute to the SQL Operations Studio extension ecosystem have new options that could affect the usability of their creations. Authors can now add wizards and informative dialogs to their extensions, Yu revealed. Wizards are meant for guiding users through a multi-step process, while dialogs are useful for drawing attention to some aspect of using an extension.

Meanwhile, Microsoft is warning database administrators that support for SQL Server 2008 and 2008 R2 is quickly drawing to an end.

In less than a year, on July 9, 2019, Microsoft will stop issuing security updates for those older versions of SQL Server. However, customers can buy more time if they move their SQL Server 2008 and 2008 R2 databases to Azure instead of upgrading to a newer version. Microsoft will provide migrators with security updates for three years after the support deadline, the company announced on July 12.

For expert thoughts on deploying SQL Server in a public cloud, particularly Google Cloud, read this recent eWEEK Data Point interview with Dave Bermingham, Microsoft Cloud and Data Center Management MVP at SIOS Technology.

LibreOffice for Windows 10 Released on Microsoft Store with $2.99 Price Tag

LibreOffice, which has long been considered the top alternative to Microsoft Office, landed on the Microsoft Store for Windows 10 devices, though the listing seems a bit bizarre at first glance.

The Document Foundation offers LibreOffice as a completely free productivity suite, yet on the Microsoft Store, it is offered with a $2.99 price tag.

There are several signs that this may be just a fake listing from someone attempting to make some money from LibreOffice’s popularity.

While the $2.99 price is said to be used for donations only, the publisher isn’t The Document Foundation, but a developer called .net. Furthermore, there’s no official announcement on TDF’s website, and the Microsoft Store listing only has two reviews, both describing it as the best Office alternative, offered free of charge.
“No official word on it yet”

Somewhat surprising is the fact that this alleged LibreOffice for Windows 10 actually downloads the productivity suite, so you end up with Writer and Math just like when downloading the Win32 installer. The app is offered as a trial, hence the $2.99 price, but users are said to be able to continue running it even after the trial ends.

Without any official announcement, I recommend you avoid downloading this version from the Store, or if you do, at least don’t pay for it. LibreOffice is entirely free of charge, and you can always get the Win32 installer to enjoy all its features without spending a single cent on it.

We have reached out to Microsoft to ask whether it’s really a fake listing or not, and if it is, expect it to be removed shortly. This wouldn’t be the first time someone has tried to capitalize on the popularity of a well-known app, though with Microsoft focusing so much on quality rather than quantity, a fake LibreOffice version in the Microsoft Store is quite unexpected.

Microsoft Warns of SQL Server 2008 End of Support in under a Year

In less than a year, Microsoft will end support for SQL Server 2008, meaning no more updates and no more support, but perhaps more problems on the security and compliance fronts for organizations that don’t migrate to newer options.

Following a 10-year run, Microsoft is giving enterprises plenty of time and plenty of guidance on how to deal with swapping out their entrenched SQL Server 2008 and SQL Server 2008 R2 installations, preferably onto the Azure cloud (where the upcoming Azure SQL Database Managed Instance is an option) or to the latest on-premises version, SQL Server 2017.

“End of support means the end of regular security updates,” said Microsoft exec Takeshi Numoto in a blog post. “With cyberattacks becoming more sophisticated and frequent, running apps and data on unsupported versions can create significant security and compliance risks. The 2008 family of products was great for its time, but we strongly recommend upgrading to the most current versions for better performance, efficiency, and regular security updates.”

Microsoft is recommending two options for enterprise upgrades: migrating to the Azure cloud or doing an on-premises upgrade.

Under the preferred Azure option, the company is offering free Extended Security Updates to secure SQL Server 2008 and 2008 R2 and Windows Server workloads for three more years after the July 9, 2019, end-of-support cutoff, recognizing “that it can be hard to upgrade everything before the end of support timeline.” Support ends for Windows Server 2008/2008 R2 on Jan. 14, 2020.

“You can also move your SQL Server 2008 and 2008 R2 deployments with no application code change and near zero downtime to Azure SQL Database Managed Instance,” Numoto said. “It is a fully-managed database-as-a-service solution with top-rated SLAs and does not require future upgrades. Azure SQL Database Managed Instance will be generally available in early Q4 of this calendar year.”

Extended Security Updates will also be available for on-premises upgrades, but only for purchase by organizations with Software Assurance or Subscription licenses under an Enterprise Agreement enrollment.

The company provides details on the upgrades in its 2008 End of Support Resource Center. That site guides organizations in finding a migration partner or undertaking the task themselves using a three-step process: assess, migrate and optimize.

More info on end-of-support options is also available in a July 12 webinar available on-demand.

Windows Server 2019 tweaked to stop it getting clock-blocked

Microsoft’s Windows Server 2019, coming later this year, will include UTC-compliant leap second support, for both added and subtracted time. But there will be no smearing.

Since 1972, leap seconds have been added to Coordinated Universal Time (UTC) to make up for its divergence from mean solar time (UT1), which slows with the Earth’s rotation, and International Atomic Time (TAI), which is based on atomic clocks and is presently 37 seconds ahead of UTC. Leap seconds serve to keep the gap between UTC and UT1 under 0.9 seconds.

To date, there have been 27 leap seconds added, during which clocks show 23:59:60 rather than rolling over to 00:00:00 after 23:59:59. An additional 10 seconds arrived as a bulk adjustment in 1972.
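The arithmetic lines up with the TAI figure quoted earlier:

```shell
# TAI-UTC today: the 10-second bulk adjustment of 1972 plus the 27
# individual leap seconds added since then.
initial=10
added=27
echo $(( initial + added ))   # 37, the number of seconds TAI is ahead of UTC
```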

There’s never been a leap second subtracted, in which a clock would go from 23:59:58 to 00:00:00, dropping 23:59:59 entirely. But Microsoft has included support for negative leap seconds just in case. (This feels like the sort of obscure capability that could well form the basis for an interesting financial market hack.)

In a blog post, Dan Cuomo, part of the Windows Core Networking team, says the time handling refinements follow from regulations in the United States (FINRA) and the EU (ESMA/MiFID II) that demand accuracy within 100 microseconds.

Rules requiring greater accuracy mean leap second smearing, in which leap seconds are sliced into pieces and added gradually over the day, isn’t a suitable option in those contexts. For what it’s worth, Google supports leap second smearing with its Network Time Protocol (NTP) servers.

According to Cuomo, leap second smearing introduces an error of up to ±0.5 seconds with respect to UTC, which falls short of modern regulatory demands. So it’s not supported in Windows Server 2019.
Regulations, regulations, regulations

Windows Server 2016 offered one-millisecond time accuracy, which met some regulatory requirements at the time. Windows Server 2019 promises further improvement with compliant leap second support, greater accuracy (through new Precision Time Protocol (PTP) support, Software Timestamping and Clock Source Stability) and traceability through logs and performance counters.

NTP, explains Cuomo, remains the default time synchronization mechanism in Windows, but it has a shortcoming, namely dealing with round-trip delays (latency) in asymmetric networks.

PTP (IEEE 1588v2) is intended for customers with stringent time accuracy requirements. “PTP enables network devices to add the latency introduced by each network device into the timing measurements thereby providing a far more accurate time sample to the endpoint (Windows Server 2019 or Windows 10, host or virtual machine),” Cuomo said.

In furtherance of timing accuracy in Windows Server 2019, Software Timestamping adds timestamps to timing packets before and after they’re processed by Windows networking components, as those components can add delays of anywhere from 30 to 200 microseconds. With this information, corrections can be made.

Then there’s Clock Source Stability, a way of improving the accuracy of the system clock over time. For Windows Server 2019, this involves taking multiple time samples, deleting the outliers, and “disciplining the clock” to stay in closer sync with the time server.
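Microsoft hasn’t published the exact algorithm, but the general idea of sampling, discarding outliers, and averaging can be sketched in a few lines (the offset values here are invented for illustration):

```shell
# Take several clock-offset samples (microseconds), drop the smallest and
# largest as outliers, and average the rest into one correction value.
discipline() {
  echo $1 | tr ' ' '\n' | sort -n | awk '
    { v[NR] = $0 }
    END {
      sum = 0
      for (i = 2; i < NR; i++) sum += v[i]   # skip min (i=1) and max (i=NR)
      printf "%d\n", sum / (NR - 2)
    }'
}

discipline "40 43 210 41 39 42"   # the 210us jitter spike is discarded
```

Trimming before averaging is what keeps one bad network round trip from yanking the clock off course.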

According to Cuomo, Microsoft’s partner Sync-N-Scale measured a pre-release version of Windows Server 2019 over 3.5 days and found its MIN Time Offset exhibited only 41 microseconds root mean square (RMS) divergence from UTC.

To ensure traceability, Windows Server 2019 supports additional log events allowing admins to see whether system clocks were altered, whether the clock frequency was modified, and whether the Windows Time service configuration was changed.

“Previous time accuracy requirements were lax by today’s standards,” said Cuomo. “Now regulated industries have much more stringent accuracy requirements, but accuracy alone isn’t enough; your systems also need to be traceable.”

And disciplined.

Microsoft Releases Windows Server 2019 Build 17713

Microsoft has released a new Windows Server 2019 build to insiders, though this time the changelog isn’t the most compelling reason to install it.

That’s because Windows Server 2019 build 17713 doesn’t include any new features, and users are instead pointed toward the improvements made in earlier releases.

Nevertheless, there are several additional downloads that you can try out, and one of them is Windows Admin Center Preview 1807.

With this release, Microsoft finally introduces a streamlined experience to connect a gateway to Azure, plus a virtual machine inventory page that supports multi-select. This means that users can perform the same actions on multiple virtual machines at the same time, a top feature request within the Insider program.
“New features”

There are two new features in this release, namely file sharing in the Files app and Azure refinements. The Files app now allows admins to add and remove users and groups, and to control their permission level.

On the Azure side, Microsoft has integrated Azure Update Management into the Updates tool.

“With Windows Admin Center, you can set up and use Azure Update Management to keep your managed servers up to date. If you don’t already have an Azure Monitoring workspace in your Azure subscription, Windows Admin Center will automatically configure your server and create the necessary Azure resources in the subscription and location you specify. If you have an existing Azure Monitoring workspace, Windows Admin Center can automatically configure your server to receive updates from Azure Update Management,” the company explains.

This new Windows Server 2019 build will expire on December 14 this year. It also comes with several known issues, which you can check in full in the box after the jump. As usual, users are encouraged to share feedback in order to help improve performance ahead of the next releases.

Using SQL Server’s Default Trace to Identify Autogrow Events in tempdb

Ideally, you should size tempdb appropriately so it doesn’t need to autogrow soon after SQL Server starts up. That isn’t always easy to do. Therefore, when you first implement a new server and/or add new databases, you should monitor the autogrowth events on tempdb. By monitoring the autogrowth events you can determine whether you have sized tempdb appropriately.

If the default trace is enabled on your server, you can use the script below to find all of your tempdb autogrowth events. Note that the default trace is enabled by default. If this code returns any autogrowth events, you may want to adjust the initial size of tempdb to keep these autogrowth events from occurring.

-- Declare variables
DECLARE @filename NVARCHAR(1000);
DECLARE @bc INT;
DECLARE @ec INT;
DECLARE @bfn VARCHAR(1000);
DECLARE @efn VARCHAR(10);

-- Get the path of the current default trace file
SELECT @filename = CAST(value AS NVARCHAR(1000))
FROM ::fn_trace_getinfo(DEFAULT)
WHERE traceid = 1 AND property = 2;

-- Rip apart the file name into pieces
SET @filename = REVERSE(@filename);
SET @bc = CHARINDEX('.',@filename);
SET @ec = CHARINDEX('_',@filename)+1;
SET @efn = REVERSE(SUBSTRING(@filename,1,@bc));
SET @bfn = REVERSE(SUBSTRING(@filename,@ec,LEN(@filename)));

-- Set filename without rollover number
SET @filename = @bfn + @efn;

-- Process all trace files
SELECT
ftg.StartTime
,te.name AS EventName
,DB_NAME(ftg.databaseid) AS DatabaseName
,ftg.Filename
,(ftg.IntegerData*8)/1024.0 AS GrowthMB
,(ftg.duration/1000) AS DurMS
FROM ::fn_trace_gettable(@filename, DEFAULT) AS ftg
INNER JOIN sys.trace_events AS te ON ftg.EventClass = te.trace_event_id
WHERE (ftg.EventClass = 92 -- Data File Auto-grow
OR ftg.EventClass = 93) -- Log File Auto-grow
AND DB_NAME(ftg.databaseid) = 'tempdb'
AND ftg.StartTime > (SELECT login_time FROM sys.dm_exec_sessions WHERE session_id = 1)
ORDER BY ftg.StartTime;
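If the query above does surface autogrowth events, one way to address them is to pre-size the tempdb files so growth happens before startup rather than during your workload. The sketch below is a hypothetical example only: the logical file names (tempdev, templog) are the SQL Server defaults, and the sizes are placeholder assumptions you would replace with values derived from the observed growth.

-- Hypothetical sketch: pre-size tempdb so it no longer needs to autogrow.
-- Confirm your actual logical file names first:
--   SELECT name FROM tempdb.sys.database_files;
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, SIZE = 8192MB, FILEGROWTH = 512MB);
ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, SIZE = 2048MB, FILEGROWTH = 256MB);

The new SIZE takes effect the next time SQL Server starts; the FILEGROWTH increment still acts as a safety net if the estimate turns out to be low.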

T-SQL Tuesday #104: Just Can’t Cut That Cord

We all have our favorite scripts, tools or utilities. Those are the things that help make our jobs easier. Some of us may have an unhealthy relationship with some of those scripts (similar in nature to the relationship many have with their phone). Whether or not the need to cut that proverbial cord exists, today we are not discussing the health of that dependence. Suffice it to say, sometimes we simply need to upgrade our scripts. How else can we get better scripts or make our scripts better – by sharing them.

This is precisely the goal Bert Wagner (b | t) seems to have envisioned for the 104th installment of TSQL Tuesday.

If you are interested in reading the original invite, you can find that here.

“For this month’s T-SQL Tuesday, I want you to write about code you’ve written that you would hate to live without.

Maybe you built a maintenance script to free up disk space, wrote a query to gather system stats for monitoring, or coded some PowerShell to clean up string data. Your work doesn’t need to be completely original either – maybe you’ve improved the code in some open source project to better solve the problem for your particular situation.”

There is a high probability that through the sharing of your script, somebody out there can benefit from that script. In addition, it is very likely that somebody will make a suggestion to help make your script better. Worst case (emphasis on worst case here), you have the script stored somewhere with half decent instructions on what it does and making it easily accessible for you to use again and again. Just in case you forget you have it out there – you can google for it again and find it on your own blog ;).

Personally, I have been able to find and re-use some of my older scripts. Not only do I get to re-discover them, but I also get to re-imagine a new use or improvement for the script.
Brief Intermission

A shout out is absolutely necessary for Adam Machanic (twitter) for creating a blog meme that has been able to survive so long in the SQLFamily. This party has helped many people come up with fresh topics as well as enabled them to continue to learn.

Easy Access

While pondering the topic for today, I had the thought occur about how frequently I post a script on my blog already anyway. An easy out for this topic would have been to re-share one of those old scripts. For instance, I could easily redo a recent article about server access that has a couple scripts demonstrated in it. Or I could go back a few years to my articles about foreign keys (here or here) and space use (here or here). Even more intriguing could be to re-envision some of my articles on Extended Events. But where would the fun in that be?

Rather than take the easy road and rehash something, I have something different. This one goes hand in hand with the numerous articles and scripts I have previously provided on auditing – yet it is different.

Not every shop can afford third party software or even Enterprise edition and so they have to come up with a different way to audit their database instances. One of the problems with a home grown solution is to ensure the data is not stored local to the server (lots of good reasons for that). Here is an example of what I did for one client that happened to have a developer that found a back door that was giving him SA access to the SQL Server Instance and was changing things and trying to cover his tracks – even after being warned.
First the query

This query will be run from a job on a different server that is restricted in access to just a select few people. I do rely on the use of the default trace in this query. I am also reliant upon a little bit of sneaky behavior. If I run this from a separate server, prying eyes are usually unlikely to find that it is running and thus makes it easier to catch them red-handed. In addition, if they discover via some sort of trace and by a lot of luck that it is running, then they have no access to the remote server to alter anything that was captured.

The query does go out to the default trace and pull back any changes to permissions or principals on the server in question. The captured data is then stored in a database that is also restricted to a select few people. Lastly, the captured data can be routinely queried, or automated reports can be created to send email notifications of changes encountered.

INSERT INTO DBA.[AUDIT].[DefTracePermissions]
([SvrName]
,[EventTimeStamp]
,[EventCategory]
,[spid]
,[subclass_name]
,[LoginName]
,[DBUserName]
,[HostName]
,[DatabaseName]
,[ObjectName]
,[TargetUserName]
,[TargetLoginName]
,[SchemaName]
,[RoleName]
,[TraceEvent]
,[ApplicationName])
SELECT [SvrName]
,[EventTimeStamp]
,[EventCategory]
,[spid]
,[subclass_name]
,[LoginName]
,[DBUserName]
,[HostName]
,[DatabaseName]
,[ObjectName]
,[TargetUserName]
,[TargetLoginName]
,[SchemaName]
,[RoleName]
,[TraceEvent]
,[ApplicationName]
FROM OPENQUERY([SomeServer],
'DECLARE @Path VARCHAR(512)
,@StartTime DATE
,@EndTime DATE = getdate()

/* These date ranges will need to be changed */
SET @StartTime = dateadd(dd, datediff(dd, 0, @EndTime) - 1, 0)

SELECT @Path = REVERSE(SUBSTRING(REVERSE([PATH]),
CHARINDEX(''\'', REVERSE([path])), 260)) + N''LOG.trc''
FROM sys.traces
WHERE is_default = 1;
SELECT @@servername as SvrName, gt.StartTime AS EventTimeStamp, tc.name AS EventCategory, spid
,tv.subclass_name
,gt.LoginName, gt.DBUserName, gt.HostName
,gt.DatabaseName, gt.ObjectName, gt.TargetUserName, gt.TargetLoginName, gt.ParentName AS SchemaName
,gt.RoleName, te.name AS TraceEvent
,gt.ApplicationName
FROM ::fn_trace_gettable( @path, DEFAULT ) gt
INNER JOIN sys.trace_events te
ON gt.EventClass = te.trace_event_id
INNER JOIN sys.trace_categories tc
ON te.category_id = tc.category_id
INNER JOIN sys.trace_subclass_values tv
ON gt.EventSubClass = tv.subclass_value
AND gt.EventClass = tv.trace_event_id
WHERE 1 = 1
AND CONVERT(date,gt.StartTime) >= @StartTime
AND CONVERT(date,gt.StartTime) <= @EndTime
AND tc.name = ''Security Audit''
AND gt.TargetLoginName IS NOT NULL
ORDER BY gt.StartTime;');

The second part of the trickery here is that I am using a linked server to perform the queries (a slight change and I could also do this via powershell which will be shown in a future article). The linked server query uses the openquery format and sends the default trace query to the remote server. Since I am running this from a job on an administrative server that pulls a limited data set, I am not overly concerned with the linked server setup here.
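For completeness, here is a hedged sketch of the linked server setup the OPENQUERY call relies on. The server name [SomeServer] matches the placeholder used above; the login mapping shown is an assumption and should be restricted to the minimum-privilege account that runs the agent job.

-- Hypothetical sketch: register the monitored instance as a linked server
-- on the administrative box. Lock the login mapping down to the account
-- that runs the audit job rather than mapping all logins.
EXEC master.dbo.sp_addlinkedserver
    @server = N'SomeServer',
    @srvproduct = N'SQL Server';
EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname = N'SomeServer',
    @useself = N'True';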
Storing It

Once I query the data, I need to put it somewhere on my administrative server. The table setup for that is very straightforward.

USE [DBA]
GO

IF SCHEMA_ID('AUDIT') IS NULL
BEGIN
EXECUTE ('CREATE SCHEMA [AUDIT]');
END

CREATE TABLE [AUDIT].[DefTracePermissions](
[DTPermID] [bigint] IDENTITY(1,1) NOT NULL,
[SvrName] [varchar](128) NOT NULL,
[EventTimeStamp] [datetime] NOT NULL,
[EventCategory] [varchar](128) NULL,
[spid] [int] NULL,
[subclass_name] [varchar](128) NULL,
[LoginName] [varchar](128) NULL,
[DBUserName] [varchar](128) NULL,
[HostName] [varchar](128) NULL,
[DatabaseName] [varchar](128) NULL,
[ObjectName] [varchar](128) NULL,
[TargetUserName] [varchar](128) NULL,
[TargetLoginName] [varchar](128) NULL,
[SchemaName] [varchar](256) NULL,
[RoleName] [varchar](64) NULL,
[TraceEvent] [varchar](128) NULL,
[ApplicationName] [varchar](256) NULL,
CONSTRAINT [PK_DefTracePermissions] PRIMARY KEY CLUSTERED
(
[DTPermID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

After creating this table, I am ready to store the data. All I need to do is throw the audit query into an agent job and schedule it to run on a regular schedule. For my purposes, I usually only run it once a day.
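The job itself can be sketched roughly as follows. Everything here is a hypothetical example: the job name, schedule, and start time are assumptions, and the step command is a placeholder where the audit INSERT shown earlier would go.

-- Hypothetical sketch of the daily agent job on the administrative server.
USE msdb;
GO
EXEC dbo.sp_add_job
    @job_name = N'DBA - Capture Permission Changes';
EXEC dbo.sp_add_jobstep
    @job_name = N'DBA - Capture Permission Changes',
    @step_name = N'Pull default trace audit data',
    @subsystem = N'TSQL',
    @command = N'/* audit INSERT ... OPENQUERY statement goes here */',
    @database_name = N'DBA';
EXEC dbo.sp_add_jobschedule
    @job_name = N'DBA - Capture Permission Changes',
    @name = N'Daily',
    @freq_type = 4,        -- daily
    @freq_interval = 1,    -- every 1 day
    @active_start_time = 10000;  -- 01:00:00
EXEC dbo.sp_add_jobserver
    @job_name = N'DBA - Capture Permission Changes';

A daily run at a quiet hour keeps the window between a permission change and its capture small enough for this purpose, since the default trace retains several rollover files.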