-- Check database physical integrity
DBCC CHECKDB ('YourDatabase') WITH NO_INFOMSGS, ALL_ERRORMSGS;
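-- On large databases a full CHECKDB can be expensive; a lighter, commonly used
-- variant checks physical structures only (still run the full check periodically):
DBCC CHECKDB ('YourDatabase') WITH PHYSICAL_ONLY, NO_INFOMSGS;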
-- Index fragmentation statistics
SELECT
dbschemas.[name] AS 'Schema',
dbtables.[name] AS 'Table',
dbindexes.[name] AS 'Index',
indexstats.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats (DB_ID(), NULL, NULL, NULL, 'LIMITED') AS indexstats
INNER JOIN sys.tables dbtables ON dbtables.[object_id] = indexstats.[object_id]
INNER JOIN sys.schemas dbschemas ON dbtables.[schema_id] = dbschemas.[schema_id]
INNER JOIN sys.indexes AS dbindexes ON dbindexes.[object_id] = indexstats.[object_id]
AND indexstats.index_id = dbindexes.index_id
WHERE indexstats.avg_fragmentation_in_percent > 30;
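-- Example remediation based on the thresholds above. Cut-offs, index and table
-- names are placeholders; adjust them to your environment.
ALTER INDEX [IX_YourIndex] ON [dbo].[YourTable] REORGANIZE;                 -- typically 5-30% fragmentation
ALTER INDEX [IX_YourIndex] ON [dbo].[YourTable] REBUILD WITH (ONLINE = ON); -- typically > 30% (ONLINE requires Enterprise Edition)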
-- Monitor sessions that are blocked or blocking
SELECT
blocking_session_id AS BlockingSessionId,
session_id AS SessionId,
wait_type,
wait_time,
wait_resource,
text AS Query
FROM sys.dm_exec_requests
CROSS APPLY sys.dm_exec_sql_text(sql_handle)
WHERE blocking_session_id <> 0;
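-- To see the last statement submitted by a blocking session, pass its
-- blocking_session_id from the query above (123 is a placeholder):
DBCC INPUTBUFFER (123);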
-- Top 10 heaviest queries by CPU
SELECT TOP 10
qs.total_worker_time / qs.execution_count AS AvgCPUTime,
qs.execution_count,
qs.total_worker_time,
st.text AS QueryText
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
ORDER BY AvgCPUTime DESC;
-- Space used per table
EXEC sp_MSforeachtable
@command1="EXEC sp_spaceused '?'";
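-- sp_MSforeachtable is undocumented; a supported, set-based alternative
-- aggregates sys.dm_db_partition_stats (sizes reported in KB):
SELECT
s.name AS SchemaName,
t.name AS TableName,
SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS [RowCount],
SUM(ps.reserved_page_count) * 8 AS ReservedKB,
SUM(ps.used_page_count) * 8 AS UsedKB
FROM sys.dm_db_partition_stats AS ps
INNER JOIN sys.tables AS t ON t.[object_id] = ps.[object_id]
INNER JOIN sys.schemas AS s ON s.[schema_id] = t.[schema_id]
GROUP BY s.name, t.name
ORDER BY ReservedKB DESC;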
-- Statistics not updated in the last 7 days
SELECT
OBJECT_NAME(s.[object_id]) AS TableName,
s.name AS StatsName,
STATS_DATE(s.[object_id], s.stats_id) AS LastUpdated
FROM sys.stats AS s
WHERE STATS_DATE(s.[object_id], s.stats_id) < DATEADD(DAY, -7, GETDATE());
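-- To refresh stale statistics, either target a specific object (placeholder
-- name below) or update everything that needs it in the current database:
UPDATE STATISTICS [dbo].[YourTable] WITH FULLSCAN;
EXEC sp_updatestats;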
-- Check tempdb allocation contention (database_id 2)
SELECT
session_id,
wait_type,
wait_duration_ms,
resource_description
FROM sys.dm_os_waiting_tasks
WHERE wait_type LIKE 'PAGE%LATCH_%' AND resource_description LIKE '2:%';
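-- A common mitigation for tempdb allocation contention is several equally
-- sized data files (often one per core, up to 8, as a starting point).
-- File name, path and sizes below are placeholders:
ALTER DATABASE tempdb
ADD FILE (NAME = N'tempdev2', FILENAME = N'T:\TempDB\tempdev2.ndf', SIZE = 8GB, FILEGROWTH = 512MB);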
-- Current size and free space of the database files
SELECT
name AS FileName,
size / 128.0 AS SizeMB,
size / 128.0 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS INT) / 128.0 AS FreeSpaceMB
FROM sys.database_files;
-- Metadata sanity check: foreign keys declared without any column mapping
SELECT fk.name AS ForeignKey, OBJECT_NAME(fk.parent_object_id) AS ChildTable
FROM sys.foreign_keys fk
WHERE NOT EXISTS (
SELECT 1
FROM sys.foreign_key_columns fkc
JOIN sys.columns c ON fkc.parent_object_id = c.object_id AND fkc.parent_column_id = c.column_id
WHERE fkc.constraint_object_id = fk.object_id);
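-- To validate the data itself against foreign key and check constraints
-- (for example after constraints were re-enabled WITH NOCHECK),
-- DBCC CHECKCONSTRAINTS reports the offending rows:
DBCC CHECKCONSTRAINTS WITH ALL_CONSTRAINTS;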
-- Active sessions and their resource consumption
SELECT
s.session_id,
s.login_name,
r.status,
r.cpu_time,
r.total_elapsed_time,
t.text AS SqlText
FROM sys.dm_exec_sessions s
JOIN sys.dm_exec_requests r ON s.session_id = r.session_id
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t;
More SQL scripts: https://github.com/CloudDBM/mentor
Blue team tooling and tips: https://github.com/A-poc/BlueTeam-Tools
Blue Team Tips (4 tips)
- Payload extraction with Process Hacker (@embee_research)
- Prevent Script Execution via Double Click: Default Application GPO Change
- Detect Cryptojacking Malware with Proxy Logs (Dave Mckay)
- Remove null bytes in CyberChef for malware analysis (@Securityinbits)
Network Discovery and Mapping (6 tools)
Vulnerability Management (4 tools)
- OpenVAS: Open-source vulnerability scanner
- Nessus Essentials: Vulnerability scanner
- Nexpose: Vulnerability management tool
- HackerOne: Bug Bounty Management Platform
Security Monitoring (10 tools)
- Sysmon: System Monitor for Windows
- Kibana: Data visualization and exploration
- Logstash: Data collection and processing
- parsedmarc: Email DMARC data visualisation
- Phishing Catcher: Phishing catcher using Certstream
- maltrail: Malicious traffic detection system
- AutorunsToWinEventLog: Windows AutoRuns Event Parser
- procfilter: YARA-integrated process denial framework
- velociraptor: Endpoint visibility and collection tool
- SysmonSearch: Sysmon event log visualisation
Threat Tools and Techniques (11 tools)
- lolbas-project.github.io: Living Off The Land Windows Binaries
- gtfobins.github.io: Living Off The Land Linux Binaries
- filesec.io: Attacker file extensions
- KQL Search: KQL query aggregator
- Unprotect Project: Malware evasion techniques knowledge base
- chainsaw: Fast Windows Forensic Artefacts Searcher
- freq: Domain generation algorithm malware detection
- yarGen: YARA rule generator
- EmailAnalyzer: Suspicious emails analyser
- VCG: Code security scanning tool
- CyberChef: GCHQ online data manipulation platform
Threat Intelligence (4 tools)
- Maltego: Threat Intelligence Platform
- MISP: Malware Information Sharing Platform
- ThreatConnect: Threat data aggregation
- Adversary Emulation Library: An open library of adversary emulation plans
Incident Response Planning (5 tools)
- NIST Cybersecurity Framework
- Incident Response Plan: Framework for incident response
- Ransomware Response Plan: Framework for ransomware response
- Incident Response Reference Guide: Incident preparation guidance paper
- Awesome Incident Response: List of tools for incident response
Malware Detection and Analysis (11 tools)
- VirusTotal: Malicious IOC Sharing Platform
- IDA: Malware disassembler and debugger
- Ghidra: Malware reverse engineering tool
- decode-vbe: Encoded VBE script decoder
- pafish: Virtual machine sandbox detector
- lookyloo: Phishing domain mapping
- YARA: Malware identification via pattern matching
- Cuckoo Sandbox: Malware analysis sandbox
- Radare2: Reverse engineering framework
- dnSpy: .NET debugger and assembly editor
- malware-traffic-analysis.net: Malware and packet capture samples
Data Recovery (3 tools)
- Recuva: File recovery
- Extundelete: Ext3 or ext4 partition recovery
- TestDisk: Data recovery
Digital Forensics (3 tools)
- SANS SIFT: Forensic toolkit
- The Sleuth Kit: Disk image analysis tools
- Autopsy: Digital forensics platform
Security Awareness Training (4 tools)
- TryHackMe: Cyber security challenges platform
- HackTheBox: Cyber security challenges platform
- CyberDefenders: Blue team cyber security challenges platform
- PhishMe: Phishing training
Communication and Collaboration (2 tools)
- Twitter: Cyber security accounts
- Facebook ThreatExchange: Malicious indicator sharing platform
Learn from Blue Teamers with a collection of Blue Teaming Tips. These tips cover a range of tactics, tools, and methodologies to improve your blue teaming abilities.
Payload extraction with Process Hacker
Description: "Malware Analysis Tip - Use Process Hacker to watch for suspicious .NET assemblies in newly spawned processes. Combined with DnSpy - it's possible to locate and extract malicious payloads without needing to manually de-obfuscate."
Credit: @embee_research
# DatabaseDeployer
A professional-grade SQL Server database deployment tool built in C# that orchestrates DACPAC deployments with pre-deploy and post-deploy migration scripts.
## The Philosophy
Forget running loose SQL scripts and hoping they execute in the right order. DatabaseDeployer uses Microsoft's battle-tested DACPAC technology for declarative, state-based database deployments, combined with migration scripts for data transformations. This is the industrial-strength approach that serious database DevOps teams use.
## What It Does
DatabaseDeployer executes database deployments in three precise phases:
1. **Pre-Deployment Scripts** - Backup operations, pre-migration data prep, anything that needs to run before schema changes
2. **DACPAC Deployment** - Smart, differential schema deployment using Microsoft.SqlServer.DacFx
3. **Post-Deployment Scripts** - Data migrations, reference data updates, post-deployment cleanup
All script executions are tracked in a `DeploymentScripts` table, ensuring idempotent deployments. Scripts that have already been executed successfully are automatically skipped.
## Features
- **Interactive Mode** - Double-click the exe to get help and enter commands interactively
- **State-Based DACPAC Deployment** - Define your database schema declaratively, let the tool generate the diff
- **Script Orchestration** - Pre-deploy and post-deploy scripts run in alphabetical order
- **Execution Tracking** - Scripts are tracked and won't re-run unless they fail
- **Data Loss Protection** - Configurable blocking when schema changes might destroy data
- **Deployment Script Generation** - Preview the SQL before it executes
- **Comprehensive Logging** - Console and file logging with log4net
- **CI/CD Ready** - Command-line interface perfect for Azure DevOps, GitHub Actions, etc.
- **Cross-Platform** - Built on .NET 8, runs on Windows and Linux
## Installation
### Build from Source
```powershell
git clone <your-repo-url>
cd DatabaseDeployer
dotnet build -c Release
```
The compiled executable will be in `src/DatabaseDeployer.Cli/bin/Release/net8.0/`
### Package as a Tool (Optional)
```powershell
dotnet pack src/DatabaseDeployer.Cli/DatabaseDeployer.Cli.csproj -c Release
dotnet tool install --global --add-source ./src/DatabaseDeployer.Cli/bin/Release DatabaseDeployer.Cli
```
## Usage
### Interactive Mode (Double-Click Friendly!)
Simply run the executable without arguments (or double-click it in Windows Explorer):
```powershell
DBD.exe
```
The tool will:
1. Display the full help menu with all options and examples
2. Prompt you to enter your command
3. Accept your deployment command (or type 'exit' to quit)
4. Execute the command
5. Wait for a key press before closing
**Example Interactive Session:**
```
================================================================================
DatabaseDeployer - Interactive Mode
================================================================================
USAGE:
DBD.exe [OPTIONS]
OPTIONS:
-c, --connection <string> Required. SQL Server connection string
...
Enter your command (or 'exit' to quit): --connection "Server=localhost;Database=MyDB;Integrated Security=true;" --dacpac "Database.dacpac"
[Deployment executes...]
Press any key to exit...
```
### Command-Line Mode
#### Basic Deployment with DACPAC Only
```powershell
DBD.exe `
--connection "Server=localhost;Database=MyDatabase;Integrated Security=true;" `
--dacpac "path/to/MyDatabase.dacpac"
```
#### Scripts-Only Deployment (No DACPAC)
Run migration scripts without deploying a DACPAC:
```powershell
DBD.exe `
--connection "Server=localhost;Database=MyDatabase;Integrated Security=true;" `
--post-deploy "path/to/scripts/post-deploy"
```
Or with both pre and post-deploy scripts:
```powershell
DBD.exe `
--connection "Server=localhost;Database=MyDatabase;Integrated Security=true;" `
--pre-deploy "path/to/scripts/pre-deploy" `
--post-deploy "path/to/scripts/post-deploy"
```
#### Full Deployment with Pre/Post Scripts
```powershell
DBD.exe `
--connection "Server=localhost;Database=MyDatabase;Integrated Security=true;" `
--dacpac "path/to/MyDatabase.dacpac" `
--pre-deploy "path/to/scripts/pre-deploy" `
--post-deploy "path/to/scripts/post-deploy"
```
#### Generate Deployment Script Without Executing
```powershell
DBD.exe `
--connection "Server=localhost;Database=MyDatabase;Integrated Security=true;" `
--dacpac "path/to/MyDatabase.dacpac" `
--generate-script `
--script-path "output/deploy.sql"
```
#### Advanced Options
```powershell
DBD.exe `
--connection "Server=localhost;Database=MyDatabase;User Id=deploy;Password=***;" `
--dacpac "MyDatabase.dacpac" `
--pre-deploy "scripts/pre-deploy" `
--post-deploy "scripts/post-deploy" `
--block-data-loss false `
--drop-objects true `
--backup true `
--timeout 600 `
--verbose
```
## Command-Line Options
| Option | Required | Default | Description |
|--------|----------|---------|-------------|
| `-c, --connection` | Yes | - | SQL Server connection string |
| `-d, --dacpac` | No | - | Path to DACPAC file |
| `--pre-deploy` | No | - | Path to pre-deployment scripts directory |
| `--post-deploy` | No | - | Path to post-deployment scripts directory |
| `--block-data-loss` | No | true | Block deployment if data loss detected |
| `--drop-objects` | No | false | Drop objects in target not in source |
| `--ignore-permissions` | No | true | Ignore permissions during deployment |
| `--ignore-roles` | No | true | Ignore role membership |
| `--ignore-user-settings` | No | true | Ignore user settings objects |
| `--backup` | No | false | Backup database before changes |
| `--generate-script` | No | false | Generate deployment script only |
| `--script-path` | No | - | Output path for generated script |
| `--timeout` | No | 300 | Command timeout in seconds |
| `--continue-on-error` | No | false | Continue on script errors |
| `-v, --verbose` | No | false | Enable verbose logging |
## Project Structure
```
DatabaseDeployer/
├── src/
│ ├── DatabaseDeployer.Core/ # Core deployment logic
│ │ ├── DacpacDeployer.cs # DACPAC deployment orchestration
│ │ ├── ScriptRunner.cs # SQL script execution engine
│ │ ├── ScriptTracker.cs # Script execution tracking
│ │ └── DeploymentOptions.cs # Configuration model
│ ├── DatabaseDeployer.Cli/ # Command-line interface
│ │ ├── Program.cs # Entry point
│ │ ├── CommandLineOptions.cs # CLI argument parsing
│ │ └── log4net.config # Logging configuration
│ └── DatabaseDeployer.Tests/ # Unit tests
└── examples/
└── scripts/
├── pre-deploy/
│ └── 001_CreateBackup.sql
└── post-deploy/
└── 002_MigrateData.sql
```
## Script Organization Best Practices
### Naming Convention
Use numeric prefixes to control execution order:
```
scripts/
├── pre-deploy/
│ ├── 001_DisableTriggers.sql
│ ├── 002_BackupCriticalData.sql
│ └── 003_ArchiveOldRecords.sql
└── post-deploy/
├── 001_MigrateCustomerData.sql
├── 002_UpdateReferenceTables.sql
└── 003_EnableTriggers.sql
```
### Script Types
**Pre-Deploy Scripts** - Use for:
- Database backups
- Data archival
- Disabling triggers/constraints
- Pre-migration data transformations
**Post-Deploy Scripts** - Use for:
- Data migrations
- Reference data updates
- Re-enabling triggers/constraints
- Post-deployment validations
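Post-deploy scripts are typically written to be idempotent, since a failed deployment can cause them to run again. A minimal sketch of a reference-data update (table, column, and values below are hypothetical):
```sql
-- 002_UpdateReferenceTables.sql (hypothetical example)
-- The NOT EXISTS guard keeps the insert safe to re-run.
INSERT INTO [dbo].[OrderStatus] ([StatusCode], [Description])
SELECT 'ON_HOLD', N'Order placed on hold'
WHERE NOT EXISTS (
    SELECT 1 FROM [dbo].[OrderStatus] WHERE [StatusCode] = 'ON_HOLD'
);
```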
## Integration with CI/CD
### Azure DevOps Pipeline Example
```yaml
- task: PowerShell@2
displayName: 'Deploy Database'
inputs:
targetType: 'inline'
script: |
./DBD.exe `
--connection "$(ConnectionString)" `
--dacpac "$(Build.ArtifactStagingDirectory)/Database.dacpac" `
--pre-deploy "$(Build.SourcesDirectory)/database/scripts/pre-deploy" `
--post-deploy "$(Build.SourcesDirectory)/database/scripts/post-deploy" `
--verbose
```
### GitHub Actions Example
```yaml
- name: Deploy Database
run: |
./DatabaseDeployer.Cli \
--connection "${{ secrets.CONNECTION_STRING }}" \
--dacpac "./artifacts/Database.dacpac" \
--pre-deploy "./database/scripts/pre-deploy" \
--post-deploy "./database/scripts/post-deploy"
```
## How DACPAC Deployment Works
Instead of manually scripting `ALTER TABLE` and `CREATE INDEX` statements, you:
1. Define your database schema in a SQL Server Database Project (.sqlproj)
2. Build the project to generate a `.dacpac` file
3. DatabaseDeployer compares the dacpac to the target database
4. It generates and executes only the necessary changes
5. Dependencies are automatically resolved (tables before views, etc.)
This is **declarative deployment** - you describe what the database *should look like*, not how to get there.
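For instance, the desired state of a table is simply a `CREATE TABLE` script in the database project; adding a column to that definition later results in the corresponding `ALTER TABLE` being generated against the target. The table below is purely illustrative:
```sql
-- Tables/Customer.sql in the database project (illustrative)
CREATE TABLE [dbo].[Customer]
(
    [CustomerId] INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    [Name]       NVARCHAR(200)     NOT NULL,
    [Email]      NVARCHAR(320)     NULL  -- adding a column here becomes an ALTER TABLE on deploy
);
```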
## Deployment Tracking
DatabaseDeployer creates a `DeploymentScripts` table to track script executions:
```sql
CREATE TABLE [dbo].[DeploymentScripts]
(
[Id] INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
[ScriptName] NVARCHAR(500) NOT NULL,
[ScriptType] NVARCHAR(50) NOT NULL, -- 'PreDeploy' or 'PostDeploy'
[ExecutedDate] DATETIME2 NOT NULL,
[ExecutedBy] NVARCHAR(256) NOT NULL,
[Status] NVARCHAR(50) NOT NULL, -- 'Success' or 'Failed'
[ErrorMessage] NVARCHAR(MAX) NULL,
[DurationMs] BIGINT NULL
);
```
Scripts that have been successfully executed are automatically skipped on subsequent deployments.
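Conceptually, the skip check is a lookup along these lines before each script runs (a simplified sketch, not necessarily the tool's exact query):
```sql
-- Skip the script if a successful execution is already recorded.
SELECT 1
FROM [dbo].[DeploymentScripts]
WHERE [ScriptName] = '001_MigrateCustomerData.sql'
  AND [Status] = 'Success';
```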
## Building a DACPAC
You can create DACPACs using:
1. **Visual Studio SQL Server Database Project** - Right-click project > Build
2. **MSBuild** - `msbuild MyDatabase.sqlproj /p:Configuration=Release`
3. **SqlPackage** - Extract from existing database: `sqlpackage /Action:Extract /SourceConnectionString:"..." /TargetFile:Database.dacpac`
## Error Handling
- By default, deployment stops on first error
- Use `--continue-on-error` to execute remaining scripts after failures
- All errors are logged to console and log file
- Failed scripts are recorded in the tracking table
## Logging
Logs are written to:
- **Console** - Real-time deployment progress
- **File** - `logs/DatabaseDeployer.log` with rolling file appender
Configure logging in `log4net.config`
## License
MIT License - Use it, abuse it, make it your own.
## Contributing
This is a reusable tool. Fork it, extend it, make it better. No bureaucracy, just code.
---
**Built with anger and caffeine by developers who are tired of brittle database deployment scripts.**
