You can use PowerShell to calculate the exact size of a specific folder in Windows (recursively, including all subfolders). This lets you quickly find out how much disk space a directory uses without third-party tools such as TreeSize or WinDirStat. In addition, PowerShell gives you more flexibility to filter or exclude files (by type, size, or date) when calculating the folder size.
Use the following PowerShell cmdlets to calculate the size of a folder:
- Get-ChildItem (alias: gci) – gets a list of files (with their sizes) in a directory, including nested subfolders. Previously, we showed you how to use the Get-ChildItem cmdlet to find the largest files on the disk.
- Measure-Object (alias: measure) – sums the sizes of all files to get the total directory size.
For example, to find the size of the D:\ISO directory in GB, run:
(Get-ChildItem D:\ISO -force -Recurse -ErrorAction SilentlyContinue| measure Length -sum).sum / 1Gb
Parameters used:
- -Force – include hidden and system files
- -Recurse – get a list of files in subfolders
- -ErrorAction SilentlyContinue – ignore files and folders the current user is not allowed to access
- measure Length -Sum – sum the sizes of all files (the Length property)
- .sum / 1Gb – show the total size in GB
In this example, the directory size is about 37 GB (this PowerShell command ignores NTFS file system compression if it is enabled).
To round the results to two decimal places, use the command:
"{0:N2} GB" -f ((Get-ChildItem D:\ISO -force -Recurse -ErrorAction SilentlyContinue| measure Length -sum).sum / 1Gb)
PowerShell can find the total size of all files of a particular type in a directory. For example, add the *.iso parameter to find out how much space is taken up by ISO files:
"{0:N2} GB" -f ((Get-ChildItem D:\ISO *.iso -force -Recurse -ErrorAction SilentlyContinue| measure Length -sum).sum / 1Gb)
You can use other filters to select which files are included in the directory size calculation. For example, to get the total size of the files in the directory that were created in 2024:
(gci -Force D:\ISO -Recurse -ErrorAction SilentlyContinue | ? {$_.CreationTime -gt '1/1/24' -AND $_.CreationTime -lt '12/31/24'} | measure Length -Sum).Sum / 1Gb
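Similarly, you can filter by file size. A quick variant of the same command that sums only files larger than 100 MB could look like this (the 100MB threshold is just an example):
(gci -Force D:\ISO -Recurse -ErrorAction SilentlyContinue | ? {$_.Length -gt 100MB} | measure Length -Sum).Sum / 1Gb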
If there are symbolic or hard links in the directory, the above PowerShell cmdlet displays an incorrect folder size. For example, the C:\Windows
directory contains many hard links to files in the WinSxS folder (Windows Component Store). Such files may be counted several times. Use the following command to ignore hard links:
"{0:N2} GB" -f ((gci –force C:\Windows –Recurse -ErrorAction SilentlyContinue | Where-Object { $_.LinkType -notmatch "HardLink" }| measure Length -s).sum / 1Gb)
The following script gets the sizes of all top-level subfolders in the target folder and the number of files in each subfolder (in this example, it displays the size of all user profiles in C:\Users):
$targetfolder = 'C:\Users'
$dataColl = @()
gci -force $targetfolder -ErrorAction SilentlyContinue | ? { $_ -is [io.directoryinfo] } | % {
    $len = 0
    gci -recurse -force $_.fullname -ErrorAction SilentlyContinue | % { $len += $_.length }
    $filesCount = (gci -recurse -force $_.fullname -File -ErrorAction SilentlyContinue | Measure-Object).Count
    $dataObject = New-Object PSObject -Property @{
        Folder = $_.fullname
        SizeGb = ('{0:N3}' -f ($len / 1Gb)) -as [single]
        filesCount = $filesCount
    }
    $dataColl += $dataObject
}
$dataColl | Out-GridView -Title "Subfolder sizes and number of files"
% is an alias for the ForEach-Object loop. The script displays an Out-GridView graphical table listing the directories, their sizes, and the number of files they contain. In the Out-GridView form, you can sort the folders by size or by the number of files. You can also export the results to a CSV file (| Export-Csv folder_size.csv) or to an Excel file.
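For example, if you need the results in a file rather than the grid, one option is to replace the last Out-GridView line with an Export-Csv pipe like the following (the output path is just an example):
$dataColl | Sort-Object SizeGb -Descending | Export-Csv -Path "C:\PS\folder_size.csv" -NoTypeInformation -Encoding UTF8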
20 comments
Great script! Many thanks.
Just adding here: If you want to display the output directly to the screen instead of the grid (for example within a docker container), use:
$dataColl | Write-Output
Thanks for your addition!
Well written article, thanks!
I do have one suggestion. I have to run PowerShell from a command prompt but can't run it in a script (running remotely with a tool that doesn't have access to PowerShell). The formatting ({0:N2}) and the pipe were not allowing the switches to work. So I had to replace '{0:N2}' with '[math]::Round(…)', and I had to escape the pipe with a caret. The end result looked like:
[math]::Round((gci -Force c:\Windows -Recurse -ErrorAction SilentlyContinue ^| measure Length -s).sum / 1Gb)
And that got me exactly what I needed. But I wouldn’t have gotten the jump start I needed without this guidance.
Thanks again!
Hello,
How can I add a code line for files?
Thank you
I added the command below to get the file count as well, but it only counts one level of subdirectories; it doesn't count files in sub-subfolders. Can you help with it?
Add-Member -InputObject $dataObject -MemberType NoteProperty -Name "count" -Value (Get-ChildItem $_.FullName -Recurse | Measure-Object).Count
This is exactly what I am looking for. One request: how can I export the final result to a .csv file?
I did it manually: selected all the lines, copied them, and pasted them into Excel.
It pasted already separated into columns.
If you have a script to do this automatically, that would be better.
$dataColl | Export-Csv "Path to the CSV file" -NoTypeInformation
The last script, which shows the sizes of all subfolders in a GUI, doesn't work. The previous script worked, but this one just hangs in PS and nothing happens…
Great Script!!! Thank you!
I had to manually delete and re-type the double quotes. Then it worked!
Just the sorting by size is not working for me. It sorts the values as text, in this order:
97,5
9
80,6
8
But it’s just a detail that doesn’t take away from the script’s merits. Great script!
[…] for an easy way to get the size of all user profiles folder through PowerShell and found this woshub and inspired by it ended with the following function which measures the size of everything in […]
Nice code, it works!
I like the third party tool: Directory Report
It can filter by file types, modification date, size and owner
It can save its output to many file types including directly to MS-Excel
Sorting fix:
$targetfolder = 'C:\'
$dataColl = @()
gci -force $targetfolder -ErrorAction SilentlyContinue | ? { $_ -is [io.directoryinfo] } | % {
$len = 0
gci -recurse -force $_.fullname -ErrorAction SilentlyContinue | % { $len += $_.length }
$foldername = $_.fullname
$foldersize = [math]::Round(($len / 1Gb),2)
$foldersizeint = [int]$foldersize
$dataObject = New-Object PSObject
Add-Member -InputObject $dataObject -MemberType NoteProperty -Name "foldername" -Value $foldername
Add-Member -InputObject $dataObject -MemberType NoteProperty -Name "foldersizeGb" -Value $foldersize
$dataColl += $dataObject
}
$dataColl | Out-GridView -Title "Size of subdirectories"
Unbelievable, a basic daily task needs… a program? Well done, MS!! What's next?
Will you replace "cls" with a 20-line program to clear the screen?
Actually, cls is an alias for the Clear-Host function.
You can check it with this command:
Get-Command cls
CommandType Name Version Source
----------- ---- ------- ------
Alias cls -> Clear-Host
and you can get the Clear-Host function definition body using this:
Get-Command Clear-Host | select -ExpandProperty Definition
$RawUI = $Host.UI.RawUI
$RawUI.CursorPosition = @{X=0;Y=0}
$RawUI.SetBufferContents(
@{Top = -1; Bottom = -1; Right = -1; Left = -1},
@{Character = ' '; ForegroundColor = $rawui.ForegroundColor; BackgroundColor = $rawui.BackgroundColor})
# .Link
# https://go.microsoft.com/fwlink/?LinkID=2096480
# .ExternalHelp System.Management.Automation.dll-help.xml
Without the comments, it's only five lines of code.
But I think getting a folder size is so useful that PowerShell should include a built-in command for it.
If you are on a file system with a big cluster size, or you are trying to find the size with byte precision, you can't do that with this script. The cluster size becomes your minimum size, and all other sizes will be multiples of it.
Unfortunately, there is no way in PowerShell to do the same as Explorer does. Or at least nobody knows of one.
Is it possible to add a member type for the file count in each folder? This is perfect, and I edited it a tiny bit to match what we need for planning file server migrations (well, at least the planning part).