Monday, October 29, 2007

Windows IT Pro free utilities

My notes from Windows IT Pro September 2007 free utilities:

Open Computer and Software Inventory Next Generation (OCS Inventory NG) covers more than just Unix or just Windows servers. I stumbled upon it before, but I never made a note of it until now.

Locate PC will send alerts to an e-mail address you specify if IP information changes. Useful in case of theft, but the article author says he gets a few false positives when a laptop briefly drops off WiFi.

SyncBack is the article author's current choice for remote backup, and he mentions he has tried many.

SIW, System Information for Windows, can tell you anything about your system. And supposedly it can recover a lost password. This is worth keeping a note of; it will probably come in handy.

Wink is a screencasting tool, similar to a screenshot but with a time aspect. I have always been looking for a free alternative to Snag-It, and all I have so far is Printkey 2000, which is old. Maybe this will point me in the right direction!

Windows 2008 notes

Current notes about Windows 2008 status that I found worth noting:

Terminal Services functionality gets much better:

  • Terminal Services Gateway (TSG), which lets you connect to a TSG and from there on to other servers. This takes away one reason, but not all, to use G/On.
  • A new kind of RDP over SSL. Newer XP and 2003 RDP clients will be able to use this.
  • Remote Programs, which can be placed on your desktop while running on a remote server over RDP. This takes away one reason, but not all, to use Citrix.

Server Core is the ability to install a Windows 2008 server without a GUI, or in fact with a very limited GUI. This is interesting, especially for a Unix server administrator like myself.

  • Server Core does not have .NET.
  • Server Core can not use Windows PowerShell functionality as that is .NET based!
  • Server Core can not yet be bought under a special license; it is an installation option.
  • Server Core can run IIS, but not with .NET.
  • Server Core can run DHCP, DNS, WINS, and file and print servers.
  • Server Core can not run Exchange 2007 or SQL server 2005.
  • Server Core is managed from the command window, which means the command line (see the sketch after this list).
  • Server Core can be GUI administered with MMC from a full-blown Windows 2008.
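
To get a feel for the command-line management, here is a minimal sketch of setting a static IP and adding the DNS role on Server Core. The commands are my assumption from the current builds (netsh, oclist and ocsetup), so treat them as illustrative:

REM Set a static IP on the first NIC (the connection name varies per install):
netsh interface ipv4 set address "Local Area Connection" static 192.168.1.10 255.255.255.0 192.168.1.1
REM List installable roles, then install the DNS server role:
oclist
start /w ocsetup DNS-Server-Core-Role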

64-bit is here, get used to it! 64-bit is considered the default for Windows Server 2008 installations!

Active Directory, now called Active Directory Domain Services (ADDS), introduces some new features:

  • Read Only Domain Controllers (RODC) alongside read/write Domain Controllers, instead of all domain controllers being read/write as they have been since Windows 2000 and 2003.
  • Active Directory snapshots, which you can load, browse and compare. No need to install a separate Domain Controller.
  • Fine-grained password policies for people inside the same domain. Try running adsiedit.msc.

The Windows 2008 version of the Group Policy Management Console (GPMC) has a Find command for searching the GPOs :-)

Thursday, October 25, 2007

FreeBSD 7 and release documentation readup

FreeBSD 7.0 is on its way, and as always I like to read about it a good while before I start testing and doing actual upgrades. Usually I read the release notes, but before that I look at schedules and to-do lists on the FreeBSD website, and now I can add another website to the list of information:

The FreeBSD Release Documentation Snapshot Page is a great starting point for any "readup", especially while preparing upgrades, or for general info on where FreeBSD releases are going. Also useful if you have some hardware you don't know whether FreeBSD supports.

Are you secure? Prove it!

I focus on the security and maintainability of the IT services I am involved with. I see many people who do not spend the necessary amount of time on quality assurance of the IT services they provide, which then raises problems with security and maintainability. Without proper understanding of the services, the IT administration job becomes harder!

Another great blog post on Taosecurity pins down much of what I believe in and work from in my everyday job:

Are you secure? Prove it. ... You could expand my five word question into are you operating a process that maintains an acceptable level of perceived risk?

I believe that if you can answer yes in the right way, you often get the bonus of an in-depth understanding of your IT service, making maintainability a much smaller problem. So the investment in being secure becomes much more than just that!

There was a very interesting reply to the post, one mentioning OSSEC, my current favorite must-have system for IT administrators, regardless of whether the platform is Windows, FreeBSD or Linux:

How would you go about performing #7 without some type of SEM? Ideally, you would combine SEM with NSM, which is what I plan on doing. Any suggestions? I've read through several of your posts regarding CS-MARS, etc. and I can understand how SEMs don't give you enough information to act upon alerts as they are alert-centric and usually don't provide you with session data or full content data, but at least they can point you in the right direction of further investigation. They provide you with what Daniel from the OSSEC project calls a LIDS (log-based intrusion detection system) and then do the job of correlating them from numerous devices. So how would you do the above (#7) without some sort of SEM?

SEM = Security Event Management. HTH

My answer to the above would of course be to combine SEM and NSM. I would not rely on only one system; I am using a combination of the following: NSM/IDS and HIM/LIDS/SEM.

Here is the complete list from the post; it is just awesome reading. And with some positive talk for selling the NSM idea, which I am all for!

So, are you secure? Prove it.

  1. Yes. Then, crickets (i.e., silence for you non-imaginative folks.) This is completely unacceptable. The failure to provide any kind of proof is security by belief. We want security by fact.

  2. Yes, we have product X, Y, Z, etc. deployed. This is better, but it's another expression of belief and not fact. The only fact here is that technologies can be abused, subverted, and broken. Technologies can be simultaneously effective against one attack model and completely worthless against another.

  3. Yes, we are compliant with regulation X. Regulatory compliance is usually a check-box paperwork exercise whose controls lag attack models of the day by one to five years, if not more. A compliant enterprise is like feeling an ocean liner is secure because it left dry dock with life boats and jackets. If regulatory compliance is more than a paperwork self-survey, we approach the realm of real evidence. However, I have not seen any compliance assessments which measure anything of operational relevance.

  4. Yes, we have logs indicating we prevented attacks X, Y, and Z. This is getting close to the right answer, but it's still inadequate. For the first time we have some real evidence (logs) but these will probably not provide the whole picture. Sure, logs indicate what was stopped, but what about activities that were allowed? Were they all normal, or were some malicious but unrecognized by the preventative mechanism?

  5. Yes, we do not have any indications that our systems are acting outside their expected usage patterns. Some would call this rationale the definition of security. Whether or not this answer is acceptable depends on the nature of the indications. If you have no indications because you are not monitoring anything, then this excuse is hollow. If you have no indications and you comprehensively track the state of an asset, then we are making real progress. That leads to the penultimate answer, which is very close to ideal.

  6. Yes, we do not have any indications that our systems are acting outside their expected usage patterns, and we thoroughly collect, analyze, and escalate a variety of network-, host-, and memory-based evidence for signs of violations. This is really close to the correct answer. The absence of indications of intrusion is only significant if you have some assurance that you've properly instrumented and understood the asset. You must have trustworthy monitoring systems in order to trust that an asset is "secure." If this is really close, why isn't it correct?

  7. Yes, we do not have any indications that our systems are acting outside their expected usage patterns, and we thoroughly collect, analyze, and escalate a variety of network-, host-, and memory-based evidence for signs of violations. We regularly test our detection and response people, processes, and tools against external adversary simulations that match or exceed the capabilities and intentions of the parties attacking our enterprise (e.g., the threat). Here you see the reason why number 6 was insufficient. If you assumed that number 6 was ok, you forgot to ensure that your operations were up to the task of detecting and responding to intrusions. Periodically you must benchmark your perceived effectiveness against a neutral third party in an operational exercise (a "red team" event). A final assumption inherent in all seven answers is that you know the assets you are trying to secure, which is no mean feat.


Incidentally, this post explains why deploying a so-called IPS does nothing for ensuring "security." Of course, you can demonstrate that it blocked attacks X, Y, and Z. But, how can you be sure it didn't miss something?

If you want to spend the least amount of money to take the biggest step towards Magnificent Number 7, you should implement Network Security Monitoring.

Thursday, October 18, 2007

Advanced Batch File Techniques

Many Windows administrators keep using batch for small scripts and utils. It is a good idea in the right situations!

Batch is good for keeping things simple, and the majority of your colleagues will most likely feel safe when it's in batch, compared to perl, vbs, php, sh or even c/c++ utils. Of course there are people who are so good at programming that they can read any code, but most likely they are not your average Windows administrator colleague.

I found an advanced batch example, which uses functions in batch, and also writes to files, reads from files and of course cleans up. It's good for some batch education. Also check out the improved version, which does not use files. Impressive :-)

For my own little task today, I wanted to get the drive from a string, so I can switch to that drive and directory, and at the same time replace / with \ so mkdir and cd work. Looking at the all-round really cool scripting pages from Rob van der Woude, and his awesome batch examples, I came up with this test script, which does the job:

@echo off
SET STRING=c:/some/dir
ECHO Original string: %STRING%
SET STRING=%STRING:/=\%
SET FIXEDSTRING=%STRING%
ECHO Fixed string: %FIXEDSTRING%

FOR /F "tokens=1 delims=\ " %%A IN ('echo %string%') DO SET drive=%%A
ECHO we are on %drive%

SET DRIVE2=%FIXEDSTRING:~0,2%
ECHO or another way, we are on %DRIVE2%

SET STRING=%FIXEDSTRING:~2,9999%
ECHO Dir string: %STRING%

What remains is how to suck in a config file, or a file with lines of strings that I want to manipulate (e.g. a list of filenames). How can I do this? :-) (A first sketch follows after these links.) Some places to look could be:

http://home.pcisys.net/~sungstad/useful/PCbatch.html

http://www.microsoft.com/resources/documentation/windows/xp/all/proddocs/en-us/for.mspx?mfr=true

http://www.maem.umr.edu/batch/stack1.html
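
Until I find the proper answer, here is a minimal sketch of what I have in mind, assuming a plain text file with one string per line (files.txt is just an illustration):

@echo off
REM Read files.txt line by line; "delims=" keeps whole lines
REM (including spaces) in %%L instead of splitting on whitespace:
FOR /F "usebackq delims=" %%L IN ("files.txt") DO (
    ECHO Processing: %%L
)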

UPDATE 1:

My first example broke if STRING contained quotation marks, so instead I came up with this to get the drive and dirname, and I also stripped the quotes:

REM Find the drive of %localroot%:
FOR /F "tokens=1 delims=\ " %%A IN ('echo %localroot%') DO SET localrootdrive=%%~dA

REM Remove surrounding quotation marks:
FOR /F "delims=" %%A IN ('echo %localroot%') DO SET localroot=%%~A
FOR /F "delims=" %%A IN ('echo %vsspath%') DO SET vsspath=%%~A

REM Now get directory:
set localrootdir=%localroot:~2%

REM Strip any leading /
IF "%vsspath:~0,1%"=="/" SET vsspath=%vsspath:~1%
REM Replace / to \ in VSS path:
set vsspathwin=%vsspath:/=\%

REM Create a tmpfile we can use (this is from MKSNT, not Windows):
FOR /F "usebackq" %%A IN (`tempfile`) DO set envtmpfile=%%A
set envtmpfile=%envtmpfile:/=\%
if exist %envtmpfile% del %envtmpfile%

REM Exit if there was an error earlier, eg. like:
IF ERRORLEVEL 1 set error=true
if "%error%"=="true" exit /b 1
exit /b 0

Some for loop notes:
for /l %a in (start increment final) do
for /l %i in (4 2 10) do echo %i
To work with all lines from a .txt file or output from a command, use:
for /f %i in (c:\file1.txt c:\file2.txt) do echo %i
for /f %i in ('dir /ad /b \\server\share\files*') do echo %i

Remember to use %%i in scripts, and not just %i.

On a side note, do remember that environment variables can be useful in your batch scripts. Use set var=something to set and set var= to unset. And access them from inside your scripts as you would normal variables:
if defined SOME_ENV_VAR goto somewhere
start some.exe
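
Putting the set/unset and the defined test together, a trivial example:

set BACKUPDIR=d:\backup
if defined BACKUPDIR echo Backing up to %BACKUPDIR%
set BACKUPDIR=
if not defined BACKUPDIR echo BACKUPDIR is now unset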

UPDATE 2:

It turns out that running call /? gives you much of the cool information I have missed over the years:

call /?
Calls one batch program from another.

CALL [drive:][path]filename [batch-parameters]

batch-parameters Specifies any command-line information required by the
batch program.

If Command Extensions are enabled CALL changes as follows:

CALL command now accepts labels as the target of the CALL. The syntax
is:

CALL :label arguments

A new batch file context is created with the specified arguments and
control is passed to the statement after the label specified. You must
"exit" twice by reaching the end of the batch script file twice. The
first time you read the end, control will return to just after the CALL
statement. The second time will exit the batch script. Type GOTO /?
for a description of the GOTO :EOF extension that will allow you to
"return" from a batch script.

In addition, expansion of batch script argument references (%0, %1,
etc.) have been changed as follows:


%* in a batch script refers to all the arguments (e.g. %1 %2 %3
%4 %5 ...)

Substitution of batch parameters (%n) has been enhanced. You can
now use the following optional syntax:

%~1 - expands %1 removing any surrounding quotes (")
%~f1 - expands %1 to a fully qualified path name
%~d1 - expands %1 to a drive letter only
%~p1 - expands %1 to a path only
%~n1 - expands %1 to a file name only
%~x1 - expands %1 to a file extension only
%~s1 - expanded path contains short names only
%~a1 - expands %1 to file attributes
%~t1 - expands %1 to date/time of file
%~z1 - expands %1 to size of file
%~$PATH:1 - searches the directories listed in the PATH
environment variable and expands %1 to the fully
qualified name of the first one found. If the
environment variable name is not defined or the
file is not found by the search, then this
modifier expands to the empty string

The modifiers can be combined to get compound results:

%~dp1 - expands %1 to a drive letter and path only
%~nx1 - expands %1 to a file name and extension only
%~dp$PATH:1 - searches the directories listed in the PATH
environment variable for %1 and expands to the
drive letter and path of the first one found.
%~ftza1 - expands %1 to a DIR like output line

In the above examples %1 and PATH can be replaced by other
valid values. The %~ syntax is terminated by a valid argument
number. The %~ modifiers may not be used with %*
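
A quick self-contained example combining the CALL :label syntax with the %~ modifiers:

@echo off
call :report "C:\Program Files\test.txt"
goto :eof

:report
REM %~1 strips the surrounding quotes, %~d1 gives the drive,
REM %~p1 the path and %~nx1 the file name plus extension:
echo Full    : %~1
echo Drive   : %~d1
echo Path    : %~p1
echo NameExt : %~nx1
goto :eof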

A good explanation of the FOR LOOP possibilities can be found here [http://www.computerhope.com/forhlp.htm]:

eol=c
specifies an end of line comment character (just one)

skip=n
specifies the number of lines to skip at the beginning of the
file.

delims=xxx
specifies a delimiter set. This replaces the default delimiter
set of space and tab.

tokens=x,y,m-n
specifies which tokens from each line are to be passed to
the for body for each iteration. This will cause additional variable names to be
allocated. The m-n form is a range, specifying the mth through the nth tokens.
If the last character in the tokens= string is an asterisk, then an additional
variable is allocated and receives the remaining text on the line after the last
token parsed.

usebackq
specifies that the new semantics are in force, where a back
quoted string is executed as a command and a single quoted string is a literal
string command and allows the use of double quotes to quote file names in
filenameset.
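
For example, to skip a header line and pick fields out of a comma-separated file (users.csv is just an illustration):

REM Skip line 1 of users.csv and pass fields 1 and 3
REM to the loop body as %%A and %%B:
FOR /F "skip=1 tokens=1,3 delims=," %%A IN (users.csv) DO ECHO %%A uses %%B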


Wednesday, October 17, 2007

PC Decrapifier, for Windows

Before I started this blog I read ten free security tools. Just a few days ago, I wanted to use the decrapifier that was mentioned, but I had forgotten the name. So now it's here for future ease of use:
The PC Decrapifier does exactly that -- removes crapware that comes pre-installed on Windows computers.

This program will not remove crapware from older computers but is perfect for new machines that ship with trialware.

There is a long list of products it will find and remove, including QuickBooks Trial, NetZero Installers, Earthlink Setup Files, Google Desktop and the myriad of anti-virus trialware apps.

Others from the list worth mentioning:
File Shredder is a must-have privacy tool that wipes/destroys documents beyond recovery.
GMER, a free rootkit scanning tool built by a Polish Windows internals guru, is widely hailed as the best at ferreting out stealth rootkits from PCs.

America Online's Active Virus Shield, powered by Kaspersky Lab, is one of the better free anti-virus packages available for Windows users.

OpenDNS is a must-have free service (there's no software to install) that speeds up Web surfing, corrects domain typos on the fly and protects you from phishing scams.

All you do is change your DNS settings (instructions here) to the OpenDNS servers: 208.67.222.222 and 208.67.220.220
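
On XP/2003 the same change can also be scripted with netsh, something like this (the connection name varies per machine):

netsh interface ip set dns "Local Area Connection" static 208.67.222.222
netsh interface ip add dns "Local Area Connection" 208.67.220.220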


Trying OmniFind Yahoo Search

When looking for Windows search util solutions, I stumbled upon OmniFind, which seemed too good to be true:
Install it in 3 clicks, configure it in minutes.
Free, searches up to 500,000 documents.
Search both the enterprise and the Internet from a single interface.
Incorporates open source Apache Lucene technology to deliver the best of community innovation with IBM's enterprise features.
But OmniFind turned out to be exactly like that! Downloading, installing, configuring, and testing indexing of a website and a filesystem location, all done in 15 minutes!

The server OS requirements are not my favorite, but for the enterprise it makes sense, and it is expected when it comes to IBM. Their favorites are of course Redhat and Suse. Too bad for me, my favorite Linux being Debian, and of course I always vouch for FreeBSD.

32-bit Red Hat Enterprise Linux Version 4, Update 3
32-bit SUSE Linux Enterprise 10
32-bit Windows XP SP2
32-bit Windows 2003 Server SP1

Some notes from the testing so far:

Indexing filesystems, with .doc, .xls, works like a charm, and the search results can be browsed "as html" and "cached". Very useful!

OmniFind installs as its own webservice, on a port of your choice. I changed the search page appearance with company logo and disabled all the Yahoo links. All very simple from the OmniFind admin control panel!

To search for a string inside a word, you should add a wildcard. For example you should search for "regression*" to make sure you locate occurrences of "regressions".

Reindexing seems to be something you have to wrap into your own scripts and schedule yourself, e.g. with at jobs.

You can use scripts to start or stop a crawler.
Crawler management scripts allow you to schedule and execute start and stop crawler actions, or start and stop a crawler from the command line.
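
So a scheduled nightly reindex could look something like this, with recrawl.cmd being a hypothetical wrapper around those crawler management scripts:

REM Recrawl every night at 02:00 (recrawl.cmd is a hypothetical wrapper):
at 02:00 /every:M,T,W,Th,F,S,Su "c:\scripts\recrawl.cmd"
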
Cleaning the index of documents that should not have been crawled is not so friendly. It seems you have to delete the entire source, e.g. a website, then crawl it again. That can be tiresome for a big website.

The language pack should be installed before you start crawling your big sources, as you will have to crawl all over again once the language pack has been installed.

Crawling protected websites is possible; I have tested https:// protected by basic authentication, and it worked fine. Crawling form-based authentication, such as a company portal document handling system, should also be possible:

HTML form-based authentication settings:
  • Form name (optional). Example: loginPage
  • Form action. Example: http://www.example.org/authentication/login.do
  • HTTP method (POST or GET). Example: POST
  • Form parameters (optional). Example: userid and myuserID


So far, I am very pleased with OmniFind, and I recommend everyone give it a try. OmniFind might be the single point of entry for knowledge search that your organization needs to bring knowledge from many sources to life and into use!

Tuesday, October 16, 2007

Windows date and regional setting

Different regional settings on Windows servers will cause the date command to give different output, which can be annoying if you want to use a date string in your batch scripts.

So I was very happy to see an ingenious solution:

   @echo off&SETLOCAL

:: This will return date into environment vars
:: Works on any NT/2K/XP machine independent of regional date settings
:: 20 March 2002

FOR /f "tokens=1-4 delims=/-. " %%G IN ('date /t') DO (call :s_fixdate %%G %%H %%I %%J)
goto :s_print_the_date

:s_fixdate
REM If the first token is a day name rather than a number, shift it away:
if "%1:~0,1%" GTR "9" shift
REM Parse the date command's own prompt, e.g. "(mm-dd-yy)", to learn
REM the field order for this machine's regional setting:
FOR /f "skip=1 tokens=2-4 delims=(-)" %%G IN ('echo.^|date') DO (
set %%G=%1&set %%H=%2&set %%I=%3)
goto :eof

:s_print_the_date
echo Month:[%mm%] Day:[%dd%] Year:[%yy%]
ENDLOCAL&SET mm=%mm%&SET dd=%dd%&SET yy=%yy%
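
Saved as e.g. getdate.cmd, the values survive the final ENDLOCAL and can be used from a calling script:

call getdate.cmd
REM Build a sortable date stamp from the returned variables:
set stamp=%yy%%mm%%dd%
echo Today's logfile: app-%stamp%.log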

Intranet and file system search tools on Windows

Recently I have looked into challenges and requirements for search tools for knowledge management. In my testing, I have been focusing on tools that could run off a Unix box, indexing several sources of information. Testing of those tools is still ongoing.

Now I have another use for search tools, this time running off a Windows server. The requirements, e.g. what sources to index, are the same as for the Unix tools still being tested.

Using the very good searchtools.com website, I found some interesting tools:
  • Mnogosearch Windows
  • Zoom search engine
  • Apache Solr
  • OmniFind
So far I have set up Mnogosearch for Windows with MSSQL (SQL Express 2005), but I still have to set up search integration into IIS. I have stalled this test, mainly because of the price! It is so very expensive that I could almost get a GSA mini instead. For testing, the trial version indexing 1 KB of data from each file is okay, but it's just too expensive to put more work into. Add to that, the Windows version seems to be falling behind in releases and does not seem to be maintained very much.

I have not tested Apache Lucene Solr yet. It could become hard for me to test, as it is Java based and I don't have a ready-to-run environment for such testing. Reading about Solr, it should be able to index an intranet, hopefully shared drives too, but I have to look at it!

OmniFind, like Solr, is based on Lucene, but seems like a better package for me to test. It is free, can index file systems, and sounds too good to be true:
  • Install it in 3 clicks, configure it in minutes.
  • Searches up to 500,000 documents.
  • Search both the enterprise and the Internet from a single interface.
  • Incorporates open source Apache Lucene technology to deliver the best of community innovation with IBM's enterprise features.

I have installed the Zoom search engine on my laptop, indexing a directory with some .doc, .txt, .cmd etc. files, and putting the resulting search page on an IIS webserver! Simple and working! In the free version Zoom will only index static files, and a max of 50 documents. This is annoying; I would rather have the full version for e.g. 30 days! Notes so far:

  • Cheap, $99 for pro, $299 for enterprise use.
  • Very easy setup
  • Search does not return documents which have the searched word in the filename!
  • Can reindexing be automated?

Search tools, challenges and non-trivial requirements

I have listed some key challenges for my current usage of search tools:

  • Create a point of entry for search.
  • Link to relevant search queries from a portal (e.g. an operation status website).
  • Some knowledge should only be available to some people. This seems to be the biggest hurdle!

Limiting knowledge/search only to some people could be solved in at least 2 ways:

  1. Set up different indexer/crawler configurations, each searchable from a different search prompt. The problem could be multiple crawls of the same info (load, storage, resources).
  2. Index/crawl everything once, and let the search box/website/frontend control who can see what. This would be preferred.

Listing non-trivial requirements which are not always available:

  • Parse Open Office word and calc documents (.odt and .ods), which are basically zip files with XML (unzip and parse e.g. content.xml; see the sketch after this list).
  • Crawling/indexing file systems (shares/hard drives), setting a base URL for how the search results will become browsable.
  • Reindexing must be automated, e.g. scheduled or cron'd.
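
The Open Office parsing idea as a rough sketch: unzip content.xml and strip the tags to get indexable text. This assumes unzip.exe and sed.exe (e.g. from GnuWin32) are on the PATH:

@echo off
REM Pull the XML body out of the OpenDocument zip container:
unzip -o -qq document.odt content.xml -d %TEMP%
REM Crudely strip the XML tags, leaving plain text for the indexer:
sed -e "s/<[^>]*>/ /g" %TEMP%\content.xml > document.txt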

Wednesday, October 10, 2007

Windows package management

I love the apt-get that comes with Debian; it is my favorite Linux package management system. So I was happy when I stumbled upon Win-get:

win-get is an automated install system and software repository for Microsoft Windows written in pascal (for the command line client) and php for the online repository. The ideas for its creation come from apt-get and other related tools for the *nix platforms.

Recently I have not been spending much time on my Windows client installations, so I probably won't try Win-get and will stick with manual updating.

If I had more Windows clients to maintain, I most certainly would make them log on to a Samba Windows domain and use WPKG for package management.

Sunday, October 7, 2007

FreeBSD system beep and dual monitor setup

I am using PCBSD on my R60 laptop, and turning off the system beep was even easier than using kbdcontrol -b off or sysctl.

I simply used the Bell Settings from the KDE menu and set the volume to 0. Simple and effective! The sound system still works like a charm.

I have given up on setting up dual monitors with our TV, as everything I have found so far points toward a lot of xorg.conf tweaking, something I just don't want to spend my time on!
I have decided that my next laptop must have dual output, e.g. VGA or DVI, and then I just might give it a shot.

I still hope to find a way to switch the R60 VGA output back and forth between laptop and TV. That would be nice to have!

Friday, October 5, 2007

Knoppix-NSM

I am still playing around with my FreeBSD laptop, trying to tune it for NSM (snort, sguil etc.) and penetration testing (nessus 2 and 3), but I am only moving slowly forward.

So when I stumbled upon Knoppix-NSM in an NSM and Sguil article, I thought: why spend all this time tuning my laptop if I can boot a LiveCD and be running?

Well, I haven't tried the Knoppix-NSM LiveCD, because when I think about it, I enjoy the learning while playing and tuning when setting up the FreeBSD laptop for what I need. And I can live with the delay, as I don't have anything I must scan or monitor right at this moment! And perhaps daily work will be more automatic when I am done, which is also a major concern.

Comparing Office documents

I found a website with a good overview of different diff utils. It has utils I already know, in addition to many others; unfortunately it does not seem like there are any utils which can be customized the way I really want.

Even though most of the tools are Windows based and not open source or freeware, I simply have to give a few of the trial versions a try, if only for the following possibilities:
  • Custom file filters
  • Command line: supported
  • Plug-ins for: data files (CSV), image formats, exe/dll version information, mp3 files, icon/cursor files, MS Office and others

Snips from the diffutils.com reviews:

Beyond Compare 2 is a great software solution for almost any revision control project. It has powerful merge and synchronization functionality. However, Beyond Compare 2 is not completely suitable for the comparison of MS Excel and Word files. If your comparison project includes largely MS Office documents, we advise you to use Compare Suite.

Our rating for Compare Suite is 8/10. However, it is our Editor’s choice as one of the best software applications for office document comparison. It also has powerful capabilities for integrating with document management system. In conclusion, we definitely recommend Compare Suite as the optimal choice for document comparison.

Take a look at the full utils list on http://www.diffutils.com/list-of-reviewed-software.

Defining crucial changes for text files

Sometimes I wish I were able to define what I think is a crucial change for a text file, instead of just getting every diff from one version to the next, which we have plenty of tools for.

I need more than just an average diff util to see what has actually changed from one report output to the next, in order to avoid false positive line matches.

Some of the problems with a standard diff util are that it can not handle these cases:

  • The order of rows has changed, but the content has not changed. (Moved lines detection in file compare)
  • The offset of columns has changed, but the content has not changed.
  • Whitespace (tabs and/or spaces) could be ignored.
  • Data in a line has changed, but is ok to ignore, such as date changes.
  • More or less data in a certain section has changed, but can be ignored.
  • Tag order changes, but the content within a tag does not, e.g. HTML tags.

Some examples of when a more advanced diff util could come in handy is:

  • Nessus .nsr scan result files, looking for interesting changes.
  • WYSIWYG HTML editors save tags in another way than when the file was loaded, even if there were no changes.
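
Until such a util shows up, a crude pre-filter gets part of the way: sort handles reordered lines, and sed masks data that is OK to ignore, like dates. A sketch, assuming sed, sort and diff (e.g. GnuWin32 or MKS) are on the PATH:

REM Mask dd-mm-yyyy dates, then sort so reordered lines diff clean:
sed -e "s/[0-9][0-9]-[0-9][0-9]-[0-9][0-9][0-9][0-9]/DATE/g" old.txt | sort > old.norm
sed -e "s/[0-9][0-9]-[0-9][0-9]-[0-9][0-9][0-9][0-9]/DATE/g" new.txt | sort > new.norm
diff old.norm new.norm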

One approach is to make a configuration file for the diff util, so you can use it in as many places as possible. Is this what is referred to as custom file filters?

A command line interface is required for scripted compares.

A Unix version is almost a must, because often it is output from Unix boxes that will be compared!

All this, instead of writing a custom diff parser every time a new usage comes up :-)

Thursday, October 4, 2007

Taskmanager - administrator mode

It turns out you can get a taskmanager with administrative rights without being administrator!

What you do is:
  1. Add yourself to the administrator group (this can be scripted, see after this list).
  2. Start taskmanager and tick [v] Show processes from all users.
  3. Stop taskmanager and remove yourself from the administrator group.
  4. Start taskmanager as before; it will now show processes from all users! You can not change the [ ] Show processes from all users checkbox, as expected, but it acts like it is [v] :-)
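
Steps 1 and 3 can be done from the command line if you have the rights, e.g. (the group name is localized on non-English Windows):

net localgroup Administrators %USERNAME% /add
REM ...start taskmanager and tick the checkbox, then:
net localgroup Administrators %USERNAME% /delete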

This is useful if you want to get e.g. a taskmanager with admin rights on a Citrix server where only one taskmgr.exe is published. Repeat the steps above for each user that needs to show processes from all users!

Supposedly this must be a Windows bug, but I didn't find anything about it.

Cyber Security Awareness Month

Over at the SANS Internet Storm Center (ISC), there was an article describing how they will put focus on the October Cyber Security Awareness Month - Daily Topics:

October is Cyber Security Awareness Month and the Internet Storm Center is going to focus on one security awareness subject per day. We plan to provide useful information for information security professionals who want to educate their users but do not have a ready set of awareness tips.

I will keep an eye on the topics, and keep my diary updated with interesting snips as always :-)

Tuesday, October 2, 2007

Unix utils on your Windows box, eg. quick cleanup of dirs

With MKS Toolkit or GnuWin32 on a Windows box you can reuse most of your Unix oneliners. Very handy for simple administration. There are other Unix tool collections for Windows, but GnuWin32 seems very active!

Here is an example of how to clean up .txt files created more than 35 days ago:

sh -c "find \\\\host\\sharename -name \"*.txt\" -ctime +35 -exec rm \"{}\" \";\""

That is a nice quick way to delete files older than a specific date, it can easily be modified to do more complex stuff.

More complex sample:

sh -c "find \\\\host\\sharename -name \"*.log\" -depth -mtime +0 -exec echo \"{}\" %date% \";\" grep -v \"renamed\" awk '{print \"mv \"$1\" \"$1$2$3\"renamed.log\"}' sh "


EDIT1:

The %date% syntax is really useful in batch scripts, eg:

echo %time%

11:02:54,16


echo %date%

03-10-2007


echo %date:~6%%date:~3,2%%date:~0,2%-%time:~0,2%%time:~3,2%%time:~6,2%

20071003-110300


The end.