Friday, December 28, 2007

How to job schedule or batch control?

Any IT system administration has a need for some automation, batch control, job scheduling or whatever you want to call it. It can be set up with cron jobs, at-jobs or scheduled tasks, most likely on the server where the job/application must run.

MSSQL 2005 maintenance plans have the option of running off one server but executing on another. A similar option should be present for schtasks on Windows Server, but as with MSSQL I have not tried it; I have always executed everything on the local machine where the schedule is set up.
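For the simple local case, a cron entry is all it takes. A sketch of the kind of crontab lines I mean (paths and times below are made-up examples):

```shell
# Hypothetical crontab entries for the local-server setup described above.
# Run a backup script every night at 02:30, logging stdout/stderr:
30 2 * * * /usr/local/bin/nightly-backup.sh >> /var/log/backup.log 2>&1
# Run a report every weekday morning at 07:00:
0 7 * * 1-5 /usr/local/bin/morning-report.sh
```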

Some issues with this standard job scheduling setup will come up as you go along, say you want to know any of the following:
  • How did the execution go?
  • What is executing right now?
  • What was the standard output of a previous run?
  • How long did the previous jobs take?
  • Did a job finish before a certain time or within a certain length of runtime?
  • This job should only run if these other jobs have finished properly.

All this and more are valid points in any normal IT administration. Some of the things I quite often want to do are:

  • Add new one-time jobs that need to run just once, e.g. execute a script that creates a user, deletes a user, or stops/starts a service, etc.
  • Add new permanent jobs, keeping a history of changes in start time etc.
  • Handle schedules of database servers, such as MSSQL, MySQL and Oracle.

More of what I would like in a batch control/job schedule system:

Must have:

  • Set up applications with several job steps consisting of command lines
  • Jobs must be startable at certain times
  • Keep track of execution history, including time, return codes etc.
  • Jobs and applications must run based on dependencies on other applications
  • Timeouts and alternative actions, e.g. alarms (email etc.)
  • GUI for monitoring batch progress
  • Job output (standard output) must be available centrally

Nice to have:

  • Load evaluation/weighting of nodes, deciding where to send jobs.
  • Failover execution if work nodes fail certain checks (node health check support)
  • Eliminate the need for a central control server. Nicest would be for all nodes to be aware of every other node, allowing failover and picking up nodes when they come alive again.
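To make the must-have list a bit concrete, the history/return-code tracking part can be sketched as a tiny wrapper. This is a minimal sketch; the `run_job` name and log paths are my own invention, not any existing tool:

```shell
#!/bin/sh
# Minimal sketch of the history requirement above: wrap any job so its
# exit code, start time and duration land in one central log, and its
# stdout/stderr in a per-job file. Paths and names are made-up examples.
LOGDIR=./joblog
mkdir -p "$LOGDIR"

run_job() {
    name=$1; shift
    start=$(date +%s)
    "$@" > "$LOGDIR/$name.out" 2>&1   # job output kept centrally
    rc=$?
    end=$(date +%s)
    echo "$name rc=$rc start=$start duration=$((end - start))s" >> "$LOGDIR/history.log"
    return $rc
}

run_job hello echo "hello world"
```

A real system would add timeouts, alarms and dependency checks on top of exactly this kind of record.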

When looking into what systems can do this I get caught up in a mix of grid computing, load balancing and job/batch scheduling:


Pages in the "Job scheduler" category (31 pages):

  • Batch queue
  • BatchMan
  • BatchPipes
  • Batchman
  • CONTROL-M
  • Command queue
  • Condor High-Throughput Computing System
  • Cronacle
  • Grid MP
  • IBM Houston Automated Spooling Program
  • IBM 2780/3780
  • IBM Tivoli Workload Scheduler
  • IBM Tivoli Workload Scheduler LoadLeveler
  • Job Control Language
  • Job Entry Subsystem 2/3
  • Load Sharing Facility
  • Maui Cluster Scheduler
  • Moab Cluster Suite
  • Open Source Job Scheduler
  • PTC Scheduler
  • Portable Batch System
  • RTDA Network Computer
  • Remote Job Entry
  • Retriever Communications
  • S-graph
  • SAP Central Process Scheduling
  • SHARCNET
  • Sun Grid Engine
  • Unicenter Autosys Job Management
  • Xgrid

Currently we are using TWS and a homemade system which can do all we need and is extensible. Of course TWS is something we are forced to use; the other system alone would be just fine.

Of course I will be limited to open source or free systems, so I came up with these few systems I would like to try out:

  • TORQUE is open source with support available, which is nice for the enterprise. TORQUE is available for FreeBSD via ports and is very actively maintained.
  • Sun Grid Engine is a batch queueing system implementing a superset of the
    functionality of the POSIX batch queueing framework. Also in FreeBSD ports.
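A TORQUE/PBS job is typically just a shell script with #PBS directives that qsub reads. A minimal sketch (job name and resource limits are made-up examples):

```shell
#!/bin/sh
# Minimal TORQUE/PBS job script sketch. The #PBS lines are directives for
# qsub; run directly, they are plain comments, so the script also works
# without a batch system. Name and limits below are examples only.
#PBS -N nightly-report
#PBS -l walltime=00:10:00,nodes=1
#PBS -j oe
echo "job running on $(hostname)"
echo done > report.status
```

Submitted with something like `qsub nightly-report.sh`; `qstat` then shows it in the queue.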

On a side note, I stumbled upon a cluster administration article for Unix with SSH, where cssh is suggested, with much more in the comments on http://www.debian-administration.org/.

Tuesday, December 25, 2007

Snort - what can you do

TaoSecurity gives a heads-up on his 11th Snort report, which is a good NSM read for most Snort administrators or any NSM-interested IT security technician. Reading his books will also give you the idea of the NSM approach :-)

Some snips:

"How do I make Snort log sessions/flows?" It's inspiring to see such faith in Snort, but such questions indicate a certain amount of tool-fixation.

Snort can operate in two modes: active and passive. Snort can be active either inline or offline:
  1. In an active, inline mode, Snort acts as an intrusion prevention system (IPS)...
  2. In an active, offline mode, Snort acts as a quasi-IPS...
  3. In passive, inline mode, Snort sits physically on the wire and allows all traffic to pass...
  4. ... passive, offline mode... watches traffic provided by a network tap or switch SPAN port.... is the most popular...
...The following is a transcript generated from Sguil. The data was collected by a second instance of Snort running in pure Libpcap packet logging mode. The content was built using Tcpflow. The operating system fingerprinting was done by P0f...

... This very short example hints at the real power of Snort. I tend to see Snort as a pointer to activities that require additional inquiry. A Snort alert should be the beginning of an investigation, not the end.


Yes it is the NSM story, I like it of course :-)

Oh, by the way, I can only agree with the problem of tool fixation. Tools do not solve problems, although many still think so. It requires much more :-) Related to this problem is mis-usage and security by belief (instead of fact), due to systems being set up and operated by "make install".

Tuesday, December 18, 2007

Windows scheduled tasks, backup/restore/administer

Recently I had to make an analysis of scheduled tasks on about 50 servers, mixed Windows 2000 and 2003. Some of the tasks were to be recreated on new Windows 2003 servers. Same project as the shares analysis.

I first turned to schtasks.exe, which can be used to query (and, on 2003, create) tasks, for example:
schtasks.exe /S server /delete /f /tn "calc"
schtasks.exe /S server /CREATE /SC ONSTART /TN "calc" /TR "command" /RU:"domain\user" /RP:pass
schtasks.exe /S server /run /tn "calc"

The query output gives information that can be parsed, e.g.:
schtasks /query /v /fo table
HostName TaskName Next Run Time Status Last Run Time Last Result Creator Schedule Task To Run Start In Comment Scheduled Task State Scheduled Type Start Time Start Date End Date Days Months Run As User Delete Task If Not Rescheduled Stop Task If Runs X Hours and X Mins Repeat: Every Repeat: Until: Time Repeat: Until: Duration Repeat: Stop If Still Running Idle Time Power Management

server calc Never 16:30:00, 12-12-2007 0 user At 16:30 every Mon, Tue, Wed, Thu, Fri of every week, starting 06-12-2007 C:\WINDOWS\system32\calc.exe calc.exe N/A Disabled Weekly 16:30:00 06-12-2007 N/A MONDAY,TUESDAY,WEDNESDAY,THURSDAY,FRIDAY N/A runasdomain/user Enabled 72:0 Disabled Disabled Disabled Disabled Disabled Disabled
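The parsing itself is easier against CSV output (`schtasks /query /v /fo csv`). My parser was in Perl, but the idea can be sketched in shell/awk; the sample below is trimmed and hypothetical, so the field positions match this sample, not necessarily real output:

```shell
#!/bin/sh
# Sketch: pull task name and command out of schtasks-style CSV output.
# The sample is trimmed/hypothetical; check real field positions first.
cat > tasks.csv <<'EOF'
"HostName","TaskName","Next Run Time","Status","Last Run Time","Last Result","Creator","Schedule","Task To Run"
"server","calc","Never","","16:30:00, 12-12-2007","0","user","At 16:30 every Mon-Fri","C:\WINDOWS\system32\calc.exe"
EOF
# Split on "," so commas inside quoted fields survive; strip outer quotes.
awk -F'","' 'NR > 1 { gsub(/^"|"$/, ""); print $2 " -> " $9 }' tasks.csv > tasks.txt
cat tasks.txt
```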

But as schtasks.exe does not work on Windows 2000, I turned to jt.exe from the Windows 2000 Resource Kit:
ftp://ftp.microsoft.com/reskit/win2000/jt.zip
3104f01eb01ce8b482bf895db60d7e8e jt.exe

I looked at some jt.exe examples and created a parser in Perl. The basic usage of jt.exe was pretty much limited to:
joblist from: jtbin /sm \"$myserver\" /se p
credentials: jtbin /sm \"$myserver\" /sac \"$jobname\" /gc

Here are some more examples of create commands, generated from parsing the jt.exe output:
Example of mon-fri 8-18, every minute:
schtasks.exe /create /SC WEEKLY /RI 1 /ST 08:00 /ET 18:00 /D MON,TUE,WED,THU,FRI /TN "task" /TR "cmd" /RU:domain\user
Every morning, mon-fri:
schtasks.exe /CREATE /SC Weekly /D MON,TUE,WED,THU,FRI /ST 07:00:00 /TN "task" /TR "command"

Later I found that I could patch SCHTASKS.EXE for Windows 2000 usage, and it turned out to actually work perfectly. But I had already used jt.exe output for parsing, and it did everything I needed. Here are the checksums of the files I tested patching with:

4D918C96C3306DF5F460801437BF24FC schtasks_w2k_5.1.2600.2180_patched.exe
86E33A8D9174DB2DB5001D0FD5DCFB8D schtasks_w2k3_5.1.2600.2180_orig.exe

Some of the problems I have or had while working with scheduled tasks:

Parsing more than the first trigger for a task.

How to create a task or modify the default task property "Stop Task If Runs X Hours and X Mins: 72:0". This is a problem if the task is created as ONSTART but we want it to keep running forever.

I worked around this by calling a cmd wrapper, so it is not the task itself that keeps running but a wrapper which loops.

I did not try the "jt /? /sj" option, which might be what I needed:

/SJ - set task's properties

Change one or more properties on the in-memory task object.

...

MaxRunTime = (in milliseconds)

...

Example: /sj command = notepad.exe Priority=idle DeleteWhenDone=1

How to make sure a schtask program is started in session 0?

That is, if a terminal services session 1 or 2 exists, the remote schtasks /run command will sometimes (not always) start the program in session 1 or 2, which is not always what we want.

The only workaround was to manually log into terminal services /console and start the task.

So this problem is not solved :-)

If the task is set for ONSTART it will of course start in session 0 if you reboot the server.

If there is a session 1 or 2, it does not work to use psexec, e.g. like this:

psexec \\server -i 0 -e cmd /C "schtasks.exe /RUN /TN calc"

Psexec with -i 0 (the default) and -i 2 works fine if it is not a scheduled task being started:
psexec \\server -d -e calc.exe
psexec \\server -d -i 2 calc.exe


The jt /? /sj help does not seem to list a property for which session a scheduled task starts in:

The property list has the form <propname> = <value> ...

The task properties and the form of their values:

ApplicationName = <string>
Parameters = <string>
WorkingDirectory = <string>
Comment = <string>
Creator = <string>
Priority = { Idle | Normal | High | Realtime }
MaxRunTime = <int> (in milliseconds)
Idle = <int> <int> (wait & deadline, in minutes)
Interactive = { 1 | 0 }
DontStartIfOnBatteries = { 1 | 0 }
KillIfGoingOnBatteries = { 1 | 0 }
RunOnlyIfLoggedOn = { 1 | 0 }
SystemRequired = { 1 | 0 }
DeleteWhenDone = { 1 | 0 }
Suspend = { 1 | 0 }
HaltOnError = { 1 | 0 }
StartOnlyIfIdle = { 1 | 0 }
KillOnIdleEnd = { 1 | 0 }
RestartOnIdleResume = { 1 | 0 }
Hidden = { 1 | 0 }
TaskFlags = <int> (in decimal)

<string> - must be surrounded by double quotes if it contains spaces
<date> - { m/d/y | TODAY }
<int> - any integer

Case is not significant (i.e., IDLE and Idle are both legal).

Verify computer health before allowing network access

The topic will be interesting to any Windows administrator who worries about which client computers are allowed on the network. I imagine many people have created their own ways of checking, for example before DHCP hands out an IP, or blackholing IPs if a machine's traffic or status fails checks.

With Network Access Protection (NAP) in Windows Server 2008 there is a new possibility.

Some quotes and hype:
  • Administrators can enforce policies with NAP, e.g. placing clients that fail requirements in quarantine (limited access) or with no access.
  • Using NAP with DHCP lets you protect all NAP-capable clients that get network access from DHCP, including wifi and LAN computers.
  • Windows XP SP3 will include NAP client software. Vista has it by default. The NAP client software for XP (beta 3) will make XP SP2 NAP capable.
  • NAP is not limited to Microsoft; the system just has to provide the NAP server with its health state. Example: missing!

To use NAP for DHCP you must perform these tasks (remember these are just some snips from the Windows IT Pro November 2007 issue):

  • Prepare the environment: you must have AD with one or more 2003 (or 2008) DCs, and DHCP on a 2008 machine, e.g. a member server in the domain. Open Server Manager and add the Network Policy Server (NPS), which replaces 2003's Internet Authentication Server (IAS). Etc. etc.
  • Configure health policies: in the NPS console, configure the System Health Validator (SHV) with the client requirements you have. Configure the Health Policy options: select new and check the SHVs you want to use and whether clients must e.g. pass all SHV checks to be considered healthy, e.g. automatic update on, hotfixes installed, firewall on, etc. Also create a new health policy for clients to be considered non-compliant/unhealthy. Etc. etc.
  • Create network policies for NAP: in the NPS console, set up Network Policies to specify what network access will apply to e.g. unhealthy clients. Etc. etc.
  • Configure DHCP for NAP: configure one group of scope options for compliant NAP clients and one scope for non-compliant clients. Go to the properties of the scope in the DHCP console and enable NAP for this scope in the Network Access Protection tab.
  • Enforce NAP on the client side: use the NAP client console, group policies or netsh (which has a new NAP context). You can edit GPOs from the Vista or 2008 Group Policy Management Console (GPMC). Start the Network Access Protection Agent service, set to automatic of course. On Vista there is an MMC, napclcfg.msc. The netsh command is: netsh nap client set enforcement ID = 79617.
  • Run a NAP test and check how you can notice if some clients fail. You will probably get a call from the client owner who cannot get online.
By the way, Windows 2003 SP1 already had Network Access Quarantine (NAQ), which helps administrators limit or deny connections from computers that don't comply with a company's security policies. However, there are some problems with NAQ:
  • It only works with VPN, leaving wifi and normal LAN connections out of the game!
  • NAQ is based on scripts that run on the client, which can be hard to create for every firewall or antivirus software you want to check
  • After the NAQ check is completed, the user can disable the firewall or antivirus; it will not be detected, and the level of access remains the same.

Of course NAP replaces NAQ:

NAP is essentially the replacement for Network Access Quarantine Control and the long-term solution for customers. Microsoft anticipates that partners will provide services and solutions to assist customers with the maintenance of their existing investment or the update of their networks for NAP.

For a detailed comparison of NAP with Network Access Quarantine Control in Windows Server 2003, see Network Access Protection Platform Overview.

So NAP seems like another tool in the box of Windows network administration, just like WSUS is.

Sunday, December 16, 2007

When Vista?

At my work there is a rumor that we will switch to Vista by the end of 2008. It might seem far away, but actually I think it is too soon!

I have tried Vista at home, but I went back to my XP! And that was a machine I used for entertainment, multimedia and such. So I really fear being forced to use Vista for getting work done!! It will happen of course, but I hope it won't be soon!!

For a good laugh, read the Upgrade to Windows XP :-) Also on /.

Flickr statistics and Picnik editing

Finally Flickr added stats for pro accounts, thank you :) Flickr statistics were missing, so it's nice to see them in action! I would like to be able to go back in time; maybe that will come. So far it's a good start!

And with Picnik picture editing in place, there is no chance I am leaving Flickr anytime soon! Of course I prefer Google services for most anything else ... YouTube, calendar, e-mail, documents and blogging of course :-)

Saturday, December 15, 2007

Encrypted filesystems solutions

I recommend reading the monthly CRYPTO-GRAM; it always has interesting stories from real-life security, and not just IT related. Well worth subscribing to. It is often long, but a very good security roundup of the month!

This month's CRYPTO-GRAM had some nice reflections on disk encryption. Still relevant, even after so many years of one story after another where personal data is lost; this latest is no exception!

So it should be no surprise that many people and companies still don't use disk encryption in some form or other, but it is sad.

Some quotes:
Computer security is hard. Software, computer and network security are all ongoing battles between attacker and defender. And in many cases the attacker has an inherent advantage: He only has to find one network flaw, while the defender has to find and fix every flaw.
...
There are several whole-disk encryption products on the market. I use PGP Disk's Whole Disk Encryption tool for two reasons. It's easy, and I trust both the company and the developers to write it securely. (Disclosure: I'm also on PGP Corp.'s Technical Advisory Board.)

Setup only takes a few minutes. After that, the program runs in the background. Everything works like before, and the performance degradation is negligible. Just make sure you choose a secure password -- PGP's encouragement of passphrases makes this much easier -- and you're secure against leaving your laptop in the airport or having it stolen out of your hotel room.
I am missing whole disk encryption on some of my computers, so I will look into that. On my MacBook I use FileVault.
There are other encryption programs out there. If you're a Windows Vista user, you might consider BitLocker. This program, embedded in the operating system, also encrypts the computer's entire drive. But it only works on the C: drive, so it won't help with external disks or USB tokens. And it can't be used to make encrypted zip files. But it's easy to use, and it's free. And many people like the open-source and free program, TrueCrypt. I know nothing about it.
I prefer TrueCrypt on Windows (it didn't work on *nix when I tried a while back), having all sensitive data inside containers. On FreeBSD I use GEOM Based Disk Encryption (gbde) and EncFS.

An interesting twist, and a point to take note of, is if you are forced to type in your password, by authorities or criminals:
And some countries -- the United Kingdom, Singapore, Malaysia -- have passed laws giving police the authority to demand that you divulge your passwords and encryption keys.
...
Failing that, you can try to convince the authorities that you don't have the encryption key. This works better if it's a zipped archive than the whole disk. You can argue that you're transporting the files for your boss, or that you forgot the key long ago. Make sure the time stamp on the files matches your claim, though.
...
The best defense against data loss is to not have the data in the first place.
You really don't need to walk around with all kinds of data, so don't!

Friday, December 7, 2007

IT security, determine your score of the game

I am not sure why I missed a really good post at TaoSecurity; maybe it was the size of the post and me being tired when going over his blog. This post is very important when thinking about IT security, so once again I remind myself to keep reading TaoSecurity, even if I am tired :-)

Anyway, some of the key viewpoints, some new to me, some not:

... don't think your security responsibilities end when the bottle is broken against the bow of the ship and it slides into the sea. You've got to keep watching to see if it sinks, if pirates attack, how the lifeboats handle rough seas, and so forth.

And there is an excellent list of suggestions for how to determine your enterprise "score of the game," and use that information to decide what you need to do differently.

Here are some headlines from the list:

  1. Standard client build client-side survival test. Create multiple sacrificial systems with your standard build. Deploy a client-side testing solution on them, like a honeyclient
  2. Standard client build server-side survival test. Create multiple sacrificial systems with your standard build. Deploy them as a honeynet.
  3. Standard client build client-side penetration test. Conduct my recommended penetration testing activities and time the result.
  4. Standard client build server-side penetration test. Repeat number 3 with a server-side flavor.
  5. Standard server build server-side penetration test. Repeat number 3 against your server build with a server-side flavor.
  6. Deploy low-interactive honeynets and sinkhole routers in your (internal) network. These low-interaction systems provide a means to get some indications of what might be happening inside your network.
  7. Conduct automated, sampled client host integrity assessments. Select a statistically valid subset of your clients and check them using multiple automated tools (malware/rootkit/etc. checkers) for indications of compromise.
  8. Conduct automated, sampled server host integrity assessments. Self-explanatory.
  9. Conduct manual, sampled client host integrity assessments. These are deep-dives of individual systems. You can think of it as an incident response where you have not had indication of an incident yet.
  10. Conduct manual, sampled server host integrity assessments. Self-explanatory.
  11. Conduct automated, sampled network host activity assessments. ... The idea is to let your NSM system see if any of the traffic it sees is out of the ordinary based on algorithms you provide.
  12. Conduct manual, sampled network host activity assessments. This method is more likely to produce results. Here a skilled analyst performs deep individual analysis of traffic on a sample of machines (client and server, separately) to see if any indications of compromise appear.

In all of these cases, trend your measurements over time...

Don't slip into thinking of inputs. Don't measure how many hosts are running anti-virus. We want to measure outputs. We are not proposing new controls.

Key phrases: manual vs. automated and server vs. client, and proactive investigation.

Most of the info has been on his blog before, but all together it is yet another great post :-)

Administering what your DNS queries are: OpenDNS

It seems like an obvious win for your client network security when it comes to visiting malicious hostnames: use a DNS server which denies certain hostnames based on Realtime Block Lists (RBLs). Similar to what can be used when parsing e-mails for spam points!

I have read in several places about OpenDNS, a great free DNS provider which does exactly what you would like, even with added administration to manage blacklists, see top queries etc. And they continue to improve the service and administration dashboard.

So check it out if you are administering a client network or intranet, for example. Perhaps it is a bit too far to use it for your servers :-)
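Switching a Unix client over is just a resolver change, e.g. (these are OpenDNS's well-known published resolver addresses):

```shell
# /etc/resolv.conf pointing at the OpenDNS resolvers
nameserver 208.67.222.222
nameserver 208.67.220.220
```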

Starting some PowerShell notes

For a while there was hype about Microsoft's new scripting shell; it was referred to as Monad or MSH, and now it is called PowerShell.

A good place to start is at Rob van der Woude's scripting pages:

Getting started:
Download and install
Windows PowerShell 1.0 RtW and .NET Framework 2.0 RTM and the Windows PowerShell 1.0 Documentation Pack.
You'll need to uninstall older versions of PowerShell first.

...

PowerShell Links:
Windows PowerShell Quick Start

Here are some notes and tips from the Windows IT Pro November issue:

PowerShell uses a new set of commands called cmdlets and a new syntax.
Help: get help with the get-help command.
CD: you can change to a registry key: cd hklm:\software
The Get-Alias cmdlet's alias is gal; e.g. list all aliases: gal | select name, definition
Get-Command to see the many commands available, e.g.: get-command get*
Set-Content to write values to a file: sc c:\f.txt -value "Hi"
Get-Content to read the contents of a file: gc c:\f.txt
Set-ExecutionPolicy: by default PowerShell cannot run scripts, you can only enter commands at the command line. To enable running scripts: set-executionpolicy unrestricted
Set-PsDebug: for example, step through one line at a time: set-psdebug -step
Get-Process: you can list all running processes: get-process
Get-Eventlog: for example: get-eventlog -newest 10 -logname system

I think I won't get started with PowerShell for real until Windows 2008 / Exchange 2007 or similar is being used somewhere close to where I do my administration :-)

Friday, November 30, 2007

Windows shares and NTFS file permissions, show/create/modify

Recently I had to make a share analysis of about 50 servers, mixed Windows 2000 and 2003. The shares were to be recreated on a new set of servers, including a change for some to Windows cluster server shares.

There turned out to be several possibilities:

I went down the VBS script path, and it worked out fine; I created a bunch of command one-liners I could use on the new servers or on the new clusters, e.g.:

cluster . res "share" /priv security="domain\group",grant,F:security
cluster . res "share" /priv security="domain\user",grant,R:security
net share sharename="d:\path\to\share" /GRANT:"domain\user1",READ /GRANT:"domain\user2",FULL

The net share command creates the share; on the cluster, the share was created with a wrapper script made from a Microsoft example, only changing ShareSubDirs=0. Then the above cluster command works fine.

The problem with the script method was that if there was no ACL for a share, my script did not list the share. And I didn't make the script query remote servers, so I used a little psexec workaround in the scripts:
copy listshares.vbs \\server\d$\
psexec \\server -e cmd /C "cscript d:\listshares.vbs"
psexec \\server -e cmd /C del d:\listshares.vbs
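With 50 servers, the same workaround can be generated in a loop. A sketch (the server names are examples, and the commands are just printed for review rather than executed):

```shell
#!/bin/sh
# Sketch: emit the copy/run/cleanup workaround for a list of servers,
# so the lines can be reviewed before running. Names are examples only.
for server in srv01 srv02; do
    echo "copy listshares.vbs \\\\$server\\d\$\\"
    echo "psexec \\\\$server -e cmd /C \"cscript d:\\listshares.vbs\""
    echo "psexec \\\\$server -e cmd /C del d:\\listshares.vbs"
done > sharecmds.txt
cat sharecmds.txt
```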

Anyway, in the future I recommend using RMTSHARE.EXE, which works fine on 2000/2003/XP and can query shares remotely, modify permissions, create shares and everything I need. Some examples:

List shares: RMTSHARE \\server
List permissions of a share: RMTSHARE \\server\share /users
Add a user to a share remotely: RMTSHARE \\server\share /grant "domain\user":F
Revoke a user's permissions: RMTSHARE \\server\share /grant "domain\user"

By the way, note that the "net share" command is different on Windows 2003 and on XP. There are permission options in the Windows 2003 version:
The syntax of this command is:

NET SHARE
sharename
sharename=drive:path [/GRANT:user,[READ | CHANGE | FULL]]
          [/USERS:number | /UNLIMITED]
          [/REMARK:"text"]
          [/CACHE:Manual | Documents | Programs | None]
sharename [/USERS:number | /UNLIMITED]
          [/REMARK:"text"]
          [/CACHE:Manual | Documents | Programs | None]
{sharename | devicename | drive:path} /DELETE


There is no permission option in the XP version:
net share /?
The syntax of this command is:


NET SHARE sharename
sharename=drive:path [/USERS:number | /UNLIMITED]
          [/REMARK:"text"]
          [/CACHE:Manual | Automatic | No]
sharename [/USERS:number | /UNLIMITED]
          [/REMARK:"text"]
          [/CACHE:Manual | Automatic | No]
{sharename | devicename | drive:path} /DELETE

For setting, removing and modifying NTFS file permissions I use XCACLS.VBS, which can do all we need. It also works on the clusters. Some examples:

Listing access, if you want subdirs add /s /t:
cscript c:\bin\XCACLS.vbs d:\dat\ /server server

Give access, with /e so other users are left as they were:
cscript c:\bin\XCACLS.vbs d:\dat\ /e /g "domain\user":F /server server

Revoke (/r) example, remote. !!! WARNING !!! Remember the /e or every permission will be gone:
cscript c:\bin\XCACLS.vbs d:\dat\ /e /r "domain\user" /server server

My only problem with XCACLS.VBS so far is that it truncates the output of the users, so it's hard to wrap into a script for recreation. E.g. it shows only "Domain\Some_domain_gruo" below and not the full group name:

"Allowed Domain\Some_domain_gruo Modify..."

Ideas for solving this are very welcome :-)

Thursday, November 29, 2007

More FreeBSD 7 goodies

As if the binary upgrade possibilities in FreeBSD 7 (and 6 to 7 if you like) were not enough (and actually working), there are plenty of goodies to look forward to:
  • SQL database performance ... MySQL 5.0.45 (thread-based)
  • New filesystems ... ZFS
  • Network stack changes ... Complete elimination of the Giant lock from the network stack
  • Intel wireless drivers: ... iwi (2200BG/2225BG/2915ABG) ... Works out of the box
  • Atheros protocol extensions ... 802.11n support (forthcoming standard) ... higher performance: up to 135 Mb/sec
  • Security subsystems ... Audit subsystem ... Fine-grained, configurable logging of security-relevant events ... System calls, application and user space activities
  • Performance ... If you find a workload that FreeBSD 7.0 performs poorly on, we want to hear about it!
  • IPMI (Intelligent Platform Management Interface); monitoring system hardware
Oh, and then some teasers of what to expect on the horizon:

FreeBSD 8.0-CURRENT, due some time in 2009 (maybe)

Some of the features that seem to be lurking on the horizon:
  • Continued performance optimization, also targeting 16-core systems (AMD/Intel)
  • Improved network performance on parallel workloads
  • Improved filesystem performance
  • Virtualization support: Xen, network stack virtualization, ...
  • BLUFFS: BSD Logging Updated Fast File System. UFS with filesystem-level journalling.
  • Serial Attached SCSI, SATA integrated under CAM (the storage layer also used for SCSI)
  • DTrace support from Sun; a powerful and extensible debugging and system analysis framework
  • Stuff we haven't even thought of yet!

I wish I could use FreeBSD for more of my everyday work, hehe... :-)

UPDATE: O'Reilly ONLamp had a really great article with lots of details on What's New in FreeBSD 7.0.

Sysadmin sites to include in your own search engine crawl

In the everyday life of a sysadmin Google plays a large role, but the internal knowledge base is also important, as there are (or should be!) cases related to your specific systems. So hopefully you are providing search for that internal knowledge, or it could go unused!!

I am thinking of collecting a set of external sites to include in our internal search engine crawling, as those sites seem to pop up again and again.

I will build a list of sites to include here; bear in mind this is a raw list, I will update it when they are actually put in the search crawler!

Windows administrator/script related so far:
http://www.jsifaq.com/
http://www.windowsitpro.com/topics/index.cfm?action=ArticleList&ChildTopicID=72
http://www.windowsitpro.com/Articles/ArticleID/14459/14459.html?Ad=1
http://www.windowsitpro.com/windowsnt20002003faq/
http://www.ss64.com/nt/
http://www.computerperformance.co.uk/vbscript/
http://www.robvanderwoude.com/
http://cwashington.netreach.net/
http://www.ericphelps.com/batch/index.htm
...
From the ss64.com links page there are many *very* good sites, a lot I didn't know before; here are some snips:

CommandLine.co.uk - Batch File examples and Utilities
FP Schultze - Batch files
OldNewThing - Raymond Chen's weblog

Heise-security.co.uk - Manage Win XP updates without an internet connection.
Timo Salmi - FAQ's - Useful NT/2000/XP script tricks and tips (tscmd)
Steve Hardy - NT/2K command line scripting
Rick Lively - Commands for every version of Windows and DOS
List of TCP and UDP port numbers

Joeware.net - Admin, AD and Exchange tools.
FP Westlake - Free Win32 console utilities.
Alexander Frink - NT Security Utils, Logoff, Change password.
Bill Stewart - Batch script and Windows admin tools.

Poor Mans SMS - scan a predefined IP range and list all installed software.
Microsoft App Compatibility - command line tool to collect application info.

Agent Ransack - File Search for Win XP
AnalogX - Screen capture, Terminal Server copy, etc

Autohotkey - Automate keystrokes, mouse clicks.
AutoIT - GUI scripting
Bamboo Software - Scheduled Tasks and other command line utils.
DumpSec, DumpEvt - Dump Event Log, Registry or Security info.
OCS Inventory - Open Source System Management
Filezilla - FTP
Lost NT password
NTFS undelete - undelete files
nu2/Barts Bootable CDs - Admin/Recovery
Trinity Rescue Kit - for recovery and repair of Windows machines
Netikus - Password, Ping, FTP tools.
OptimumX - Utilities by Marty List
UnDelete - Diskeeper

And perhaps:

http://www.microsoft.com/technet/scriptcenter/default.mspx

FreeBSD sysadmin so far:
http://taosecurity.blogspot.com/
...

Windows 3GB limit and applications using > 2GB

Windows servers with more than 3 GB of RAM should have a special setting in their boot.ini. Here are examples where %systemroot% is c:\winnt\, even on Windows 2003:
[boot loader]
timeout=3
default=multi(0)disk(0)rdisk(0)partition(1)\WINNT
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINNT="Windows Server 2003, Enterprise" /noexecute=optout /fastdetect /3GB /PAE

[boot loader]
timeout=3
default=multi(0)disk(0)rdisk(0)partition(1)\WINNT
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINNT="Microsoft Windows 2000 Advanced Server" /fastdetect /PAE /3GB
Also, after booting with this setting, check that your application is actually enabled to use more than 2 GB of memory (that is, if you want it to :-)). You should be able to enable an application to use more than 2 GB yourself. Microsoft has a nice description:
You can use the Imagecfg.exe file to provide selective use of application memory tuning in Microsoft Windows 2000. Executable files that can access the 3-GB address space must have the IMAGE_FILE_LARGE_ADDRESS_AWARE bit set in their image headers. You can set this bit by using the Imagecfg.exe utility; this utility is included on the Windows 2000 Server Supplement One Resource Kit CD-ROM. For example, to modify an executable file that is named Test.exe, use the following command syntax:
Imagecfg -l test.exe

You can check an exe file by running imagecfg test.exe and looking for this string:
Image can handle large (>2GB) addresses

For your reference, my copy of imagecfg.exe has this version and md5sum:
5.0.1556.1
835A3281EAC25F18B9A859F68776F167 imagecfg.exe


Of course this will not be a problem when everyone is running 64-bit, which will happen sooner or later. As you might know, Windows Server 2008 is the last version to support 32-bit.

Thursday, November 15, 2007

Getting 750 GB SATA drives working

When I got some new 750 GB drives, I attached them to my standard SATA controller, where I had two 300 GB SATA drives, but then my computer would not start! It did not help to limit the disk capacity with the jumper setting.


Moving the drives to the Promise Fasttrak controller got the PC booting, but Windows XP couldn't recognize the drives.

I knew I had to update some BIOS or drivers, but I did not know what motherboard I had. So I turned to a friend, who recommended the freeware CPU-Z tool:
http://www.cpuid.com/cpuz.php

CPU-Z produces all kinds of info; you can use the GUI, or export to file or HTML, so run it before you upgrade:
cpuz.exe -txt=%computername%-%date%-before_upgrade
I needed the motherboard model and current BIOS version:
Mainboard Model MS-6702E (0x1E1 - 0xBE28EE)
DMI BIOS
--------
vendor American Megatrends Inc.
version 080011
date 06/08/2005

I entered the model number into the MSI CPU support form and got all kinds of nice info about drivers and BIOS. But I also saw the LiveUpdate tool, which I used instead.

After installing the driver for the SATA controller and rebooting, the drives added to the Promise controller were visible to XP :-) And it turned out that my BIOS was up to date.

I did not need more support from MSI.

So the bottom line is: I got the drives working from the non-standard SATA controller :-)

Printkey 2000

I got a copy of Printkey 2000 5.10 Full from a friend; he has these md5sums:

93C16AF42A3D508F90AED5CCA1DB5D5B PrintKey.exe
DB4BC1B5BF470886D7C495E2E45C8553 Printkey2000.exe

This way I don't have to rely on some download, which I am not sure is safe:

http://www.zdnet.de/downloads/prg/6/y/de000H6Y-wc.html
3033b0d05c7e37999b4b9644f53785af *prntky.zip



Wednesday, November 14, 2007

SQL queries

Today I made a view with a simple join; it's very easy once you get the hang of it:

select b.column_name_1, a.column_name_2
from table1 a, table2 b
where b.somecolumn = a.somecolumn
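The same join can also be written with explicit JOIN ... ON syntax, which reads better once queries grow. A quick sqlite3 sketch with made-up tables and columns (underscores rather than hyphens, since unquoted hyphenated names are not valid SQL) shows both forms return the same rows:

```python
import sqlite3

# In-memory database with two small hypothetical tables
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE table1 (somecolumn INTEGER, column_name_2 TEXT);
    CREATE TABLE table2 (somecolumn INTEGER, column_name_1 TEXT);
    INSERT INTO table1 VALUES (1, 'from-a');
    INSERT INTO table2 VALUES (1, 'from-b');
""")

# Implicit join, as in the post:
implicit = con.execute(
    "SELECT b.column_name_1, a.column_name_2 "
    "FROM table1 a, table2 b WHERE b.somecolumn = a.somecolumn"
).fetchall()

# The same query in explicit JOIN syntax:
explicit = con.execute(
    "SELECT b.column_name_1, a.column_name_2 "
    "FROM table1 a JOIN table2 b ON b.somecolumn = a.somecolumn"
).fetchall()
```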

Tuesday, November 13, 2007

Open source alternatives, for MS Project

I don't think I mentioned osalt.com (open source alternatives) here before, and today I used it again :-)

It is a great website for your business colleagues or management who might not be so familiar with open source and the alternatives available. Please check the osalt sections, and send suggestions to them!

Osalt.com gives you a great overview of commercial software and the alternatives, and even includes a list of supported operating systems.

Osalt.com does not have everything, as not all great software is open source. My favorite freeware editor PSPad is one example, and it can not be found on the osalt UltraEdit alternatives list.

Recently we needed alternatives to MS Project, or at least an .mpp viewer, as the license costs for MS Project are insane. The export-to-webpage wizard is just not my friend, so a viewer for my colleagues is needed!

I would have used OpenProj, as it works on Mac and Unix and I really just need a viewer, but it requires JRE > 1.5, which I don't have here at work. Besides, it is a beta. If you really need to work with project management, go for GanttProject, which is also available for Mac.

OpenWorkbench only needs version 1.3.1 or later of Sun's Java Runtime Environment, but beware: there has not been a release since December 2005.

If you can live with a shareware MS Project viewer, you might check out Projette. I don't know if it will nag or stop working after some days; so far there have been no problems.

Datestring in batch regardless of regional date setting

A while back I mentioned a collection of advanced batch commands, and today I actually needed the good old environment variable %TimeStamp%, so here it is:

@echo off
:: Works on any NT/2K/XP machine independent of regional date settings
FOR /f "tokens=1-4 delims=/-. " %%G IN ('date /t') DO (call :s_fixdate %%G %%H %%I %%J)
for /F "delims=: tokens=1-2" %%i in ('time /t') do (call :settimeenvvars %%i %%j)
goto :s_print_the_date

:s_fixdate
if "%1:~0,1%" GTR "9" shift
FOR /f "skip=1 tokens=2-4 delims=(-)" %%G IN ('echo.^|date') DO (
set %%G=%1&set %%H=%2&set %%I=%3)
goto :eof

:settimeenvvars
set hour=%1
set minute=%2
IF 1%hour% LSS 20 SET hour=0%hour%
IF 1%minute% LSS 20 SET minute=0%minute%
goto :eof

:s_print_the_date
set timestamp=%yy%%mm%%dd%
if "%1" == "dateonly" goto :end
set timestamp=%timestamp%-%hour%%minute%

:end
echo %timestamp%
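As a side note, the whole dance above is only needed because date /t output depends on regional settings. If a scripting runtime such as Python happens to be available, the same stamp (here with a four-digit year, as a sketch rather than a drop-in replacement) is a one-liner:

```python
from datetime import datetime

def timestamp(date_only: bool = False) -> str:
    """Locale-independent YYYYMMDD or YYYYMMDD-HHMM stamp,
    mirroring the %timestamp% variable from the batch script."""
    fmt = "%Y%m%d" if date_only else "%Y%m%d-%H%M"
    return datetime.now().strftime(fmt)
```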


I have mentioned it before, but much inspiration for batch can be found at robvanderwoude.com.

One Windows program I have never had a use for before is c:\windows\system32\attrib.exe, which displays or changes file attributes:

ATTRIB [+R -R] [+A -A ] [+S -S] [+H -H] [drive:][path][filename] [/S [/D]]

+ Sets an attribute.
- Clears an attribute.
R Read-only file attribute.
A Archive file attribute.
S System file attribute.
H Hidden file attribute.
[drive:][path][filename]
Specifies a file or files for attrib to process.
/S Processes matching files in the current folder and all subfolders.
/D Processes folders as well.

Example:
attrib -r -s -h \\servername\d$\%2

Monday, November 12, 2007

FreeBSD binary upgrades

Finally it looks like there will be a binary upgrade possibility in FreeBSD, even for major versions, going from 6.x to 7.x. Very cool work, I will definitely test it!

An interesting side note is the recommended method for portupgrade of all ports; it deals with the ruby and ruby18-bdb problems we all know:
Using portupgrade to rebuild everything is a bit tricky since it can get a bit confused when upgrading the programs it uses (ruby and ruby18-bdb), but the following procedure should work:
# portsnap -I update
# portupgrade -f ruby
...
# rm /var/db/pkg/pkgdb.db
# portupgrade -f ruby18-bdb
...
# rm /var/db/pkg/pkgdb.db /usr/ports/INDEX-*.db
# portupgrade -af

Wuala and cryptographic Snake Oil

A very interesting post on the free community-based online hard disk project Wuala, which also has a pointer to the very good post on 9 signs you might be dealing with cryptographic Snake Oil:
These snake-oil warning signs are neither necessary nor sufficient criteria for separating the good cryptography from the snake oil. Just as there could be insecure products that don't trigger any of these nine warning signs, there could be secure products that look very much like snake oil. But most people don't have the time, patience, or expertise to perform the kind of analysis necessary to make an educated determination. In the absence of a Food-and-Drug-Administration-like body to regulate cryptography, the only thing a reasonable person can do is to use warning signs like these as guides.
All is of course recommended reading :-)

As for Wuala, I wouldn't mind giving it a try, I just don't have a use for it right now.

Sitemeter statistics

When I started writing here I added Google Analytics to the blog, it works really well.

In the past I have been very happy with some simple web statistics like Webalizer and AWStats, so now I have added something similar to those: Sitemeter.

Sitemeter Basic is free, it is simple, plus it can send you stat overview by e-mail. It can also act as a good old counter.

As for adding the script code to the HTML on your website, it is as simple as for Google Analytics.

Wednesday, November 7, 2007

Windows media on Mac

Just found something I need for my Macbook, to view Windows media in Firefox:

http://www.microsoft.com/windows/windowsmedia/player/wmcomponents.mspx
http://www.flip4mac.com/

There was no plugin for Firefox, so I installed it manually.

Sunday, November 4, 2007

NSM readup, for later use

I am still behind my own schedule for my NSM setup; guess my wife and our newborn (2 months old) are taking up most of my time :-)

Anyway, I want to keep a few pointers to good articles and websites for later. Once again from Taosecurity :-)
Russ McRee followed his excellent discussion of NSM and Sguil in the October InfoSecMag with a new article called Argus – Auditing network activity (.pdf), published in the November 2007 ISSA Journal. It's another great read.

UPDATE 1:
Great NSM demo from Taosecurity, using session analysis and full content, basically perfect for education: http://taosecurity.blogspot.com/2007/11/analyzing-protocol-hopping-covert.html


UPDATE2:
Taosecurity again of course: What is NSM? NSM vs. IDS, with pointer to a slide show from 2002 :-) It still holds water! One of the good ones:
“IDS” is only a product; NSM is an operation
incorporating products, people, and processes

Thursday, November 1, 2007

Get PCI compliance, and become a better administrator and a stronger team

In the spring and summer of 2006 I was part of completing a PCI compliance. This was a great experience. We achieved and learned so much from the process, and in a very short amount of time, because we had a deadline before we were going to be audited. I can only recommend the process to anyone!

Here is a quick rundown of what we used:
  • Osiris for HIM, on both Windows and FreeBSD. At the time there was no OSSEC.
  • Central syslog.
  • Snort with syslog reporting, also to SMS. We played with Sguil as NSM, but it was too much network data for the server we had set up. If I were to improve and redo something, this would be it: a server with more CPU and disk space.
  • Improved the FreeBSD (ipfw) and Windows (ipsec) firewall administration by rules being pulled from central CVS server.
  • Nessus 2.x at the time for penetration testing and remote scanning. Later fully automated, with reports sent to Subversion for diff, and to certain e-mail addresses for completeness.
  • Webservers, mailservers, dns servers etc got a security check, there was not much to improve.
  • ClamAV antivirus on Windows, which does not seem necessary, but it was a requirement.
  • All software/webpages, documentation and scripts (setup/upgrade/changes) go to CVS for ease of diff and review by the different people responsible for the entire setup.
All in all, it was a great experience for myself, and for the team of people involved. It brought us together in a new way while working toward the goal :-)

I am not the only one who is happy about the learning from being PCI compliant. Here are some snips from another administrator's experience; it is very similar to mine:

I'm using OSSEC (http://www.ossec.net) to monitor the individual SysLog files for perceived security issues. OSSEC understands Snort, Cisco PIX, IPTables, and a host of others.
Additionally, I have OSSEC agents running on each of my servers (including Windoze), which report back to a central OSSEC Server.

Network Intrusion Detection (Snort):
If you are going to use Snort, I highly recommend that you use the latest version. You'll probably have to compile it from source, but it's worth it. Snort is sending alerts to my central SysLog server, which provides a nice and easy central logging repository for Snort alerts. I'm then using OSSEC to monitor the SysLogs for Snort messages, and generate alert emails.

Rootkit detection and scanning (RKHunter and CHKRootKit [and OSSEC]):
Never trust a single Rootkit scanner. Both RKHunter and CHKRootKit are
excellent tools, but one could have more/different signatures at
different times.

Network Penetration testing (Nessus 3.x):
I can't stress this enough. If you're going to use Nessus
(http://www.nessus.org), do yourself a favor and install the latest
version.

Layer-7 Firewall (ModSecurity / Apache Proxy):
If you're really serious about CISP, spend the $5000 to purchase a
1-year support contract for ModSecurity (Breach Security
http://www.breach.com). In addition to an immense amount of help with
writing custom rules, you also get a really fast ruleset that's
specifically geared towards PCI Compliance.
One caveat, however, is that you should know a good deal about Perl
Regular expressions if you're going to implement ModSecurity. If this is
an issue for you, you may need to look into other (closed-source,
bleck!) alternatives like F5.
Another Firewall solution that I've been playing around with lately is
Untangle (http://www.untangle.com). Unfortunately, I require ethernet
bonding and 802.1q support, so it's not yet a feasible solution for me.
That being said, their Snort front-end can't be beat. And I talked
with a couple of the guys at their Linux World booth recently, who said
that they were going to start bundling Untangle with Ubuntu and other
distros (most of which provide the tools and kernel modules for 802.1q
and bonding).

Per machine firewall (IPTables with Shorewall front-end):
Shorewall is extremely powerful, if not a bit difficult to use. I
wouldn't use it for a gateway machine (although I use it as a
router-firewall between networks on my Corporate network), but it makes
a very good Host-based firewall. The idea here is to only leave the
ports that need to be open, open, and only allow access from the
machines/networks that need access to them. You will need other separate
physical firewalls between you and the rest of the world, as well as
between your servers and your database servers, but you can limit who
and what has access to a specific machine.
Secure Central Backups and Archiving (Bacula):
I really love Bacula. It's a bit of a learning curve, but it's GPL'ed,
and it runs on multiple platforms. The features of Bacula rival
NetBackup and Legato, although the interface can be cumbersome to use.
The most important feature is Archival encryption. This indemnifies you
against having to report a lost or stolen tape to all of your customers
(which you shouldn't need to worry too much about if you have a good
backup policy).
Of course, you need to have a solid policy for handling tapes that your
employees must adhere to, that a PCI/CISP auditor must sign off on.
Don't be too wordy. All they need to know is: that machines are backed
up on a regular basis, that certain backup sets are retained for XXX
days/years, that you have a compliant offsite archival policy.

Also, if you've never gone through CISP/PCI before, be prepared for a
lot of long nights, headaches, etc. Try not to get discouraged. It will
be worth it in the end. I can honestly say that I am a much better
engineer for having gone through the process.

Monday, October 29, 2007

Windows IT Pro free utilities

My notes from Windows IT Pro September 2007 free utilities:

Open Computer and Software Inventory Next Generation (OCS inventory NG), more than just unix or just Windows servers. I stumbled upon this before, but I never made a note of it until now.

Locate PC will send alerts to an e-mail address you specify if IP information changes. Useful in case of theft, but the article's author says he gets a few false positives when a laptop drops off WiFi for a short while.

SyncBack is the article author's current choice for remote backup, and he mentions he has tried many.

SIW, System Information for Windows, can tell you anything about your system. And supposedly it can recover a lost password. This is worth keeping a note of, probably will come in handy.

Wink does screencasting, similar to a screenshot but with a time aspect. I have always been looking for a free alternative to Snag-It, and all I have so far is Printkey 2000, which is old. Maybe this will help me in the right direction!

Windows 2008 notes

Current notes about Windows 2008 status that I found worth noting:

Terminal Services functionality gets much better:

  • Terminal Services Gateway (TSG) which lets you connect to a TSG and from there to other services. This takes away one, not all, reason to use G/On.
  • New kind of RDP over SSL. Newer XP and 2003 RDP clients will be able to use this.
  • Remote Programs, which can be placed on your desktop, running on a remote server over RDP. Takes away one, not all, reason to use Citrix.

Server Core is the ability to install a Windows 2008 server without a GUI, or in fact with a very limited GUI. This is interesting, especially for a Unix server administrator like myself.

  • Server Core does not have .NET.
  • Server Core can not use Windows PowerShell functionality as that is .NET based!
  • Server Core can not yet be bought with a special license; it is an installation option.
  • Server Core can run IIS, but not with .NET.
  • Server Core can run DHCP, DNS, WINS, file and printer server.
  • Server Core can not run Exchange 2007 or SQL server 2005.
  • Server Core is managed from the command window, which means command line.
  • Server Core can be GUI administered with MMC from a full-blown Windows 2008.

64-bit is here, get used to it! 64-bit is considered the default for Windows Server 2008 installations!

Active Directory now called Active Directory Domain Services (ADDS), introduces some new features.

  • Read Only Domain Controllers (RODC) alongside read/write Domain Controllers, instead of all domain controllers being read/write, as they have been since Windows 2000 and 2003.
  • Active Directory snapshots, which you can load, browse and compare. No need to install a separate Domain Controller.
  • Fine-grained password policies for users inside the same domain. Try running adsiedit.msc.

The Windows 2008 version of the Group Policy Management Console (GPMC) has a Find command for searching the GPOs :-)

Thursday, October 25, 2007

FreeBSD 7 and release documentation readup

FreeBSD 7.0 is on its way and as always I like to read about it a good while before I start testing and doing actual upgrades. Usually I read the release notes, but before that I look at schedules and todo lists on FreeBSD website, and now I can add another website to the list of information:

The FreeBSD Release Documentation Snapshot Page is a great starting point for any "readup", especially while preparing upgrades, or generally for info on where FreeBSD releases are going. Also useful if you have some hardware you don't know works with FreeBSD.

Are you secure? Prove it!

I focus on security and maintainability of the IT services I am involved with. I see many people that do not spend the necessary amount of time on IT service quality assurance of the services they provide, which then raises problems with security and maintainability. Without proper understanding of the services, the IT administration job becomes harder!

Another great blog post on Taosecurity pins much of what I believe in and work from in my everyday job:

Are you secure? Prove it. ... You could expand my five word question into are you operating a process that maintains an acceptable level of perceived risk?

I believe if you can answer yes in the right way, you can often get the bonus of in-depth understanding of your IT service, making maintainability a much lesser problem. So the investment in being secure becomes much more than just that!

There was a very interesting reply to the post, one mentioning OSSEC, my current favorite must-have system for any IT administrator, regardless of whether it is Windows, FreeBSD or Linux:

How would you go about performing #7 without some type of SEM? Ideally, you would combine SEM with NSM, which is what I plan on doing. Any suggestions? I've read through several of your posts regarding CS-MARS, etc. and I can understand how SEMs don't give you enough information to act upon alerts as they are alert-centric and usually don't provide you with session data or full content data, but at least they can point you in the right direction of further investigation. They provide you with what Daniel from the OSSEC project calls a LIDS (log-based intrusion detection system) and then do the job of correlating them from numerous devices. So how would you do the above (#7) without some sort of SEM?

SEM = Security Event Management. HTH

My answer to the above would of course be to combine SEM and NSM. I would not rely on only one system; I am using a combination of the following: NSM/IDS and HIM/LIDS/SEM.

Here is the complete list from the post, it is just awesome reading. And with some positive talking for selling the NSM idea, which I am all for!

So, are you secure? Prove it.

  1. Yes. Then, crickets (i.e., silence for you non-imaginative folks.) This is completely unacceptable. The failure to provide any kind of proof is security by belief. We want security by fact.

  2. Yes, we have product X, Y, Z, etc. deployed. This is better, but it's another expression of belief and not fact. The only fact here is that technologies can be abused, subverted, and broken. Technologies can be simultaneously effective against one attack model and completely worthless against another.

  3. Yes, we are compliant with regulation X. Regulatory compliance is usually a check-box paperwork exercise whose controls lag attack models of the day by one to five years, if not more. A compliant enterprise is like feeling an ocean liner is secure because it left dry dock with life boats and jackets. If regulatory compliance is more than a paperwork self-survey, we approach the realm of real evidence. However, I have not seen any compliance assessments which measure anything of operational relevance.

  4. Yes, we have logs indicating we prevented attacks X, Y, and Z. This is getting close to the right answer, but it's still inadequate. For the first time we have some real evidence (logs) but these will probably not provide the whole picture. Sure, logs indicate what was stopped, but what about activities that were allowed? Were they all normal, or were some malicious but unrecognized by the preventative mechanism?

  5. Yes, we do not have any indications that our systems are acting outside their expected usage patterns. Some would call this rationale the definition of security. Whether or not this answer is acceptable depends on the nature of the indications. If you have no indications because you are not monitoring anything, then this excuse is hollow. If you have no indications and you comprehensively track the state of an asset, then we are making real progress. That leads to the penultimate answer, which is very close to ideal.

  6. Yes, we do not have any indications that our systems are acting outside their expected usage patterns, and we thoroughly collect, analyze, and escalate a variety of network-, host-, and memory-based evidence for signs of violations. This is really close to the correct answer. The absence of indications of intrusion is only significant if you have some assurance that you've properly instrumented and understood the asset. You must have trustworthy monitoring systems in order to trust that an asset is "secure." If this is really close, why isn't it correct?

  7. Yes, we do not have any indications that our systems are acting outside their expected usage patterns, and we thoroughly collect, analyze, and escalate a variety of network-, host-, and memory-based evidence for signs of violations. We regularly test our detection and response people, processes, and tools against external adversary simulations that match or exceed the capabilities and intentions of the parties attacking our enterprise (e.g., the threat). Here you see the reason why number 6 was insufficient. If you assumed that number 6 was ok, you forgot to ensure that your operations were up to the task of detecting and responding to intrusions. Periodically you must benchmark your perceived effectiveness against a neutral third party in an operational exercise (a "red team" event). A final assumption inherent in all seven answers is that you know the assets you are trying to secure, which is no mean feat.


Incidentally, this post explains why deploying a so-called IPS does nothing for ensuring "security." Of course, you can demonstrate that it blocked attacks X, Y, and Z. But, how can you be sure it didn't miss something?

If you want to spend the least amount of money to take the biggest step towards Magnificent Number 7, you should implement Network Security Monitoring.

Thursday, October 18, 2007

Advanced Batch File Techniques

Many Windows administrators keep using batch for small scripts and utils. Used in the right places, it is a good idea!

Batch keeps things simple, and the majority of your colleagues will most likely feel safe when it is in batch, compared to Perl, VBS, PHP, sh or even C/C++ utils. Of course there are people who are so good at programming that they can read any code, but most likely they are not your average Windows administrator colleague.

I found an advanced batch example, which uses functions in batch, and also writes to files, reads from files, and of course cleans up. It's good for some batch education. Also check out the improved version, which does not use files. Impressive :-)

For my own little usage today, I wanted to get the drive from a string, so I can switch to that drive and directory, and at the same time replace / with \ so mkdir and cd work. Looking at the all-round really cool scripting pages from Rob van der Woude, and his awesome batch examples, I came up with this test script, which does the job:

@echo off
SET STRING=c:/some/dir
ECHO Original string: %STRING%
SET STRING=%STRING:/=\%
SET FIXEDSTRING=%STRING%
ECHO Fixed string: %FIXEDSTRING%

FOR /F "tokens=1 delims=\ " %%A IN ('echo %string%') DO SET drive=%%A
ECHO we are on %drive%

SET STRING=%STRING:~0,2%
ECHO or another way, we are on %STRING%

SET STRING=%FIXEDSTRING:~2,9999%
ECHO Dir string: %STRING%
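For comparison, the same string surgery is trivial in a scripting language. A Python sketch using ntpath (which handles Windows-style paths on any OS), with the example path from the script above:

```python
import ntpath

string = "c:/some/dir"

# Replace / with \ (the batch %STRING:/=\% substitution)
fixed = string.replace("/", "\\")

# Split off the drive (what the batch FOR /F with delims=\ does)
drive, directory = ntpath.splitdrive(fixed)
```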

What remains would be how to suck in a config file, or a file with lines of strings that I want to manipulate (e.g. a list of filenames). How can I do this? :-) Some places to look could be:

http://home.pcisys.net/~sungstad/useful/PCbatch.html

http://www.microsoft.com/resources/documentation/windows/xp/all/proddocs/en-us/for.mspx?mfr=true

http://www.maem.umr.edu/batch/stack1.html

UPDATE 1:

My first example broke if STRING contained quotation marks, so instead I came up with this to get drive and dirname, and I also stripped the quotes:

REM Find the drive of %localroot%:
FOR /F "tokens=1 delims=\ " %%A IN ('echo %localroot%') DO SET localrootdrive=%%~dA

REM Remove surrounding quotation marks:
FOR /F "delims=" %%A IN ('echo %localroot%') DO SET localroot=%%~A
FOR /F "delims=" %%A IN ('echo %vsspath%') DO SET vsspath=%%~A

REM Now get directory:
set localrootdir=%localroot:~2%

REM Strip any leading /
IF "%vsspath:~0,1%"=="/" SET vsspath=%vsspath:~1%
REM Replace / to \ in VSS path:
set vsspathwin=%vsspath:/=\%

REM Create a tmpfile we can use (this is from MKSNT, not Windows):
FOR /F "usebackq" %%A IN (`tempfile`) DO set envtmpfile=%%A
set envtmpfile=%envtmpfile:/=\%
if exist %envtmpfile% del %envtmpfile%

REM Exit if there was an error earlier, eg. like:
IF ERRORLEVEL 1 set error=true
if "%error%"=="true" exit /b 1
exit /b 0

Some for loop notes:
for /l %a in (start increment final) do
for /l %i in (4 2 10) do echo %i
To work with all lines from a .txt file or output from a command use:
for /f %i in (c:\file1.txt c:\file2.txt) do echo %i
for /f %i in ('dir /ad /b \\server\share\files*') do echo %i

Remember to use %%i in scripts, and not just %i.

On a side note, do remember that environment variables can be useful in your batch scripts. Set var=something, and set var= to unset. And access them from inside your scripts as you would normal variables:
if defined SOME_ENV_VAR goto somewhere
start some.exe

UPDATE 2:

It turns out running call /? gives you much of the cool information I have missed over the years:

call /?
Calls one batch program from another.

CALL [drive:][path]filename [batch-parameters]

batch-parameters Specifies any command-line information required by the
batch program.

If Command Extensions are enabled CALL changes as follows:

CALL command now accepts labels as the target of the CALL. The syntax
is:

CALL :label arguments

A new batch file context is created with the specified arguments and
control is passed to the statement after the label specified. You must
"exit" twice by reaching the end of the batch script file twice. The
first time you read the end, control will return to just after the CALL
statement. The second time will exit the batch script. Type GOTO /?
for a description of the GOTO :EOF extension that will allow you to
"return" from a batch script.

In addition, expansion of batch script argument references (%0, %1,
etc.) have been changed as follows:


%* in a batch script refers to all the arguments (e.g. %1 %2 %3
%4 %5 ...)

Substitution of batch parameters (%n) has been enhanced. You can
now use the following optional syntax:

%~1 - expands %1 removing any surrounding quotes (")
%~f1 - expands %1 to a fully qualified path name
%~d1 - expands %1 to a drive letter only
%~p1 - expands %1 to a path only
%~n1 - expands %1 to a file name only
%~x1 - expands %1 to a file extension only
%~s1 - expanded path contains short names only
%~a1 - expands %1 to file attributes
%~t1 - expands %1 to date/time of file
%~z1 - expands %1 to size of file
%~$PATH:1 - searches the directories listed in the PATH
environment variable and expands %1 to the fully
qualified name of the first one found. If the
environment variable name is not defined or the
file is not found by the search, then this
modifier expands to the empty string

The modifiers can be combined to get compound results:

%~dp1 - expands %1 to a drive letter and path only
%~nx1 - expands %1 to a file name and extension only
%~dp$PATH:1 - searches the directories listed in the PATH
environment variable for %1 and expands to the
drive letter and path of the first one found.
%~ftza1 - expands %1 to a DIR like output line

In the above examples %1 and PATH can be replaced by other
valid values. The %~ syntax is terminated by a valid argument
number. The %~ modifiers may not be used with %*
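For readers more comfortable outside batch, the most used %~ modifiers map roughly onto Python's ntpath functions. A sketch with a made-up path:

```python
import ntpath

path = r"C:\temp\test.exe"

# %~d1 - drive letter only
drive, rest = ntpath.splitdrive(path)

# %~p1 - path only (batch includes the trailing backslash)
directory = ntpath.dirname(rest) + "\\"

# %~n1 / %~x1 - file name only / extension only
name, ext = ntpath.splitext(ntpath.basename(path))
```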

A good explanation of the FOR loop possibilities can be found here [http://www.computerhope.com/forhlp.htm]:

eol=c
specifies an end of line comment character (just one)

skip=n
specifies the number of lines to skip at the beginning of the
file.

delims=xxx
specifies a delimiter set. This replaces the default delimiter
set of space and tab.

tokens=x,y,m-n
specifies which tokens from each line are to be passed to
the for body for each iteration. This will cause additional variable names to be
allocated. The m-n form is a range, specifying the mth through the nth tokens.
If the last character in the tokens= string is an asterisk, then an additional
variable is allocated and receives the remaining text on the line after the last
token parsed.

usebackq
specifies that the new semantics are in force, where a back
quoted string is executed as a command and a single quoted string is a literal
string command and allows the use of double quotes to quote file names in
filenameset.
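A sketch combining several of these options in a FOR /F loop; the file name users.csv and its layout are assumptions for the example:

```batch
@echo off
rem Parse a comma-separated file: skip the header line, split on commas,
rem capture the first two fields, and let * collect the rest of the line.
for /F "skip=1 tokens=1,2* delims=," %%a in (users.csv) do (
    echo Name: %%a  Id: %%b  Rest: %%c
)
```

Each extra token allocates the next variable letter in sequence, so tokens=1,2* fills %%a, %%b and %%c here.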


Wednesday, October 17, 2007

PC Decrapifier, for Windows

Before I started this blog I read ten free security tools. Just a few days ago I wanted to use the decrapifier that was mentioned, but I had forgotten the name. So now it's here for future ease of use:
The PC Decrapifier does exactly that -- removes crapware that comes pre-installed on Windows computers.

This program will not remove crapware from older computers but is perfect for new machines that ship with trialware.

There is a long list of products it will find and remove, including QuickBooks Trial, NetZero Installers, Earthlink Setup Files, Google Desktop and the myriad of anti-virus trialware apps.

Others from the list worth mentioning:
File Shredder is a must-have privacy tool that wipes/destroys documents beyond recovery.
GMER, a free rootkit scanning tool built by Polish Windows internals guru, is widely hailed as the best at ferreting out stealth rootkits from PCs.

America Online's Active Virus Shield, powered by Kaspersky Lab, is one of the better free anti-virus packages available for Windows users.

OpenDNS is a must-have free service (there's no software to install) that speeds up Web surfing, corrects domain typos on the fly and protects you from phishing scams.

All you do is change your DNS settings (instructions here) to the OpenDNS servers: 208.67.222.222 and 208.67.220.220
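On Windows this can also be done from the command line with netsh; the connection name "Local Area Connection" is an assumption and may differ on your machine:

```batch
rem Point the connection at the OpenDNS resolvers.
netsh interface ip set dns name="Local Area Connection" static 208.67.222.222
netsh interface ip add dns name="Local Area Connection" 208.67.220.220 index=2
```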


Trying OmniFind Yahoo Search

When looking for Windows search util solutions, I stumbled upon OmniFind, which seemed too good to be true:
Install it in 3 clicks, configure it in minutes.
Free, searches up to 500,000 documents.
Search both the enterprise and the Internet from a single interface.
Incorporates open source Apache Lucene technology to deliver the best of community innovation with IBM's enterprise features.
But OmniFind delivered exactly that! Downloading, installing, configuring, and test-indexing a website and a filesystem location, all done in 15 minutes!

The server OS requirements are not my favorites, but for the enterprise they make sense, and are to be expected from IBM. Their favorites are of course Red Hat and SUSE. Too bad for me, my favorite Linux being Debian, and of course I always vouch for FreeBSD.

32-bit Red Hat Enterprise Linux Version 4, Update 3
32-bit SUSE Linux Enterprise 10
32-bit Windows XP SP2
32-bit Windows 2003 Server SP1

Some notes from the testing so far:

Indexing filesystems, with .doc, .xls, works like a charm, and the search results can be browsed "as html" and "cached". Very useful!

OmniFind installs as its own webservice, on a port of your choice. I changed the search page appearance with company logo and disabled all the Yahoo links. All very simple from the OmniFind admin control panel!

To search for a string inside any word, you should add a wildcard. For example, search for "regression*" to make sure you locate occurrences of "regressions".

Reindexing seems to be something you have to wrap into your own scripts and schedule yourself, e.g. with at jobs.

You can use scripts to start or stop a crawler.
Crawler management scripts allow you to schedule and execute start and stop crawler actions, or start and stop a crawler from the command line.
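One way to schedule a nightly recrawl on a Windows box is schtasks; here restartcrawler.bat is a hypothetical wrapper around the crawler management scripts, and the time format may vary slightly between Windows versions:

```batch
rem Recrawl every night at 02:00 (XP/2003 expect HH:MM:SS for /st).
schtasks /create /tn "OmniFindRecrawl" /tr "C:\omnifind\restartcrawler.bat" /sc daily /st 02:00:00
```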
Cleaning the index of documents that should not be crawled is not so friendly. It seems you have to delete the entire source, e.g. a website, then crawl it again. That can be tiresome for a big website.

The language pack should be installed before you start crawling your big sources, as you will have to do it all over again once the language pack has been installed.

Crawling protected websites was possible; I have tested an https:// site protected by basic authentication, and it worked fine. Crawling through form-based authentication, such as a company portal document handling system, should also be possible:

HTML form-based authentication
Form name (optional)
Example: loginPage
Form action
Example: http://www.example.org/
authentication/login.do
HTTP method: POST or GET
Example: POST
Form parameters (optional)
Example: userid and myuserID


So far, I am very pleased with OmniFind, and I recommend everyone give it a try. OmniFind might be the single point of entry for knowledge search that your organization needs to bring knowledge from many sources to life and use!