Tuesday, February 19, 2008

More options for third party software updates

Not long ago I mentioned Secunia PSI (Personal Software Inspector) as a means of keeping your third-party software updated on Windows.

Now I noticed that SANS ISC has a nice article with some more recommendations:
Other options are UpdateStar (Windows), SUMo - Software Update Monitor (Windows), VersionTracker [Pro] (Mac and Windows), RadarSync (Windows), UDC - UpdateChecker (Windows), Belarc Advisor (Windows), and App Update Widget (Mac).
I have not tried any of them yet :-)

The same day they made a really good point about something that often bothers me on Windows and Mac:
Unprivileged user vs. Administrator: A few third-party Windows software do not show the availability of new updates unless you are running as Administrator.
...
Therefore, the conclusion is that you need to periodically (every day?) login as (or run things as) Administrator to perform periodic tests for new updates. Obviously, this is not practical for end users, so we clearly need to improve the third-party update mechanisms in Windows to be accurate, up-to-date and work smoothly from non-privileged accounts.

Tuesday, February 12, 2008

Xcacls.vbs directories only and column output truncated

As I mentioned earlier, the xcacls.vbs output is truncated so the information is not fully presented, e.g. usernames are cut off at 24 characters. This got very annoying, so I was happy to find a solution:
Edit xcacls.vbs line 593, Call PrintMsg( strPackString...
Edit xcacls.vbs line 614, Call AddStringToArray(arraystrACLS,

I changed the two lines to:
Call PrintMsg( strPackString("Type", 8, 1, TRUE) & strPackString("Username", 50, 1, TRUE) & strPackString("Permissions", 42, 1, TRUE) & strPackString("Inheritance", 35, 1, TRUE))
For Each objDACL_Member in objSecDescriptor.DACL

Call AddStringToArray(arraystrACLS, strPackString(strAceType, 8, 1, TRUE) & strPackString(objtrustee.Domain & "\" & objtrustee.Name, 50, 1, TRUE) & strPackString(TempSECString, 42, 1, TRUE) & strPackString(strAceFlags, 35, 1, TRUE),-1)
Set objtrustee = Nothing

Now the output is more useful.

The next problem is that I cannot get Xcacls.vbs to work only on folders when querying subdirectories. The parameters /s and /t do work across subdirs, but they include files, which is not what I want!

This does not seem possible; I cannot find a combination of switches that traverses subdirectories but displays only directory permissions, not files. I get output like:
**************************************************************************
Directory: d:\data\file.txt

Permissions:
Type Username Permissions Inheritance
...


So I had to make a small wrapper to run XCACLS only on a predefined list of dirs, without using /s or /t. This is not scalable at all!

What I would rather like is a script to get a remote dir listing, where we can check whether a file handle is a directory, and if it is, call xcacls. I don't have that yet :-)
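A rough sketch of such a directory-walking wrapper could look like the following Perl. The xcacls.vbs path is hypothetical, and the script builds a throwaway directory tree just to keep the sketch self-contained; in real use $root would point at the actual data share:

```perl
# Sketch: collect directories only (skipping files) and build one
# xcacls.vbs command per directory. Paths here are made up for
# illustration; swap print for system() to actually run the commands.
use strict;
use warnings;
use File::Find;
use File::Temp qw(tempdir);

# Throwaway tree for the demo; in real use, e.g. $root = 'd:\data'
my $root = tempdir(CLEANUP => 1);
mkdir "$root/projects" or die $!;
mkdir "$root/archive"  or die $!;

# Collect directories only, skipping plain files
my @dirs;
find(sub { push @dirs, $File::Find::name if -d }, $root);

# Build one xcacls.vbs invocation per directory (query only)
my @cmds = map { qq{cscript //nologo c:\\tools\\xcacls.vbs "$_"} } @dirs;
print "$_\n" for @cmds;
```

This avoids /s and /t entirely: the traversal happens in Perl, and xcacls.vbs only ever sees individual directory paths.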

A better solution would be very welcome.

New remote scanning requirements for PCI compliance

I have heard there might be new remote scanning requirements for PCI compliance, which presumably means Visa will require a higher level of application scanning than before. Even if that turns out not to be so, it is a good opportunity to improve the organization's IT skills, just like the original PCI compliance test was a huge improvement.

I am reading parts of the PCI Blog - Compliance Demystified blog, where there are some pointers to documents etc.

In one of the recent PCI Blog newsletters I stumbled upon some quotes regarding scanning:

Scanning is a snapshot ...
Scanning is diagnostic, not preventative ...
...
In fact SQL Injection, one of the most commonly used methods of compromise, cannot be detected using scanning.

...

Scanning is a component of the information security program, not a replacement for it - Scanning can be a useful tool when used as a part of a robust, well-rounded information security program. Relying on scanning alone can leave a company dangerously exposed to data compromise. However, when used in conjunction with timely patch management, strong internal policies and processes that are actively enforced, data classification and control practices and other elements of security practice, scanning can provide valuable insight.


I have to question their statement that "SQL injection cannot be detected using scanning". As with other vulnerabilities found by scanning, some SQL injection attack vectors can be found. In fact Nessus does a good job of finding some SQL injections, though I have seen Nessus miss SQL injections that were later found by WebInspect. The other points in the newsletter are valid and good to keep in mind!
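To make that point concrete, the simplest error-based check a scanner can do is inject a single quote into a parameter and watch for database error strings in the response. A toy simulation of that idea (the vulnerable page, SQL, and error patterns are entirely made up for illustration):

```perl
use strict;
use warnings;

# Toy stand-in for a vulnerable page: builds SQL by string
# concatenation and "fails" with a database error when the
# injected quote leaves the statement with unbalanced quoting.
sub fake_page {
    my ($id) = @_;
    my $sql    = "select name from users where id = '$id'";
    my $quotes = () = $sql =~ /'/g;    # count quote characters
    return $quotes % 2 ? "ODBC Error: syntax error near ''" : "OK";
}

# Error-based probe, like a scanner's simplest SQL injection check:
# look for telltale database error strings in the response.
sub looks_injectable {
    my ($response) = @_;
    return $response =~ /ODBC Error|SQL syntax|unclosed quotation/i;
}

my $normal = fake_page("11");
my $probed = fake_page("11'");    # the classic single-quote probe
print "normal: $normal\n";
print "probed: $probed\n";
print "vulnerable\n" if !looks_injectable($normal) && looks_injectable($probed);
```

Of course this only catches injections that produce visible errors; blind injections are exactly where scanners like Nessus tend to fall short.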

In the future, companies that want PCI compliance might be forced by Visa to buy and use either WebInspect or IBM Rational AppScan. Both are very expensive!
The Next Generation of Web Application Scanning
WebInspect 7 is the first and only web application security assessment tool to be re-architected to thoroughly analyze today's complex web applications built on emerging Web 2.0 technologies. The new architecture delivers faster scanning capabilities, broader assessment coverage, and the most accurate results of any web application scanner available.

Open source alternatives for web application scanning tools that come even close to the capabilities of WebInspect and Rational AppScan would be awesome. Please leave a comment if you have any ideas :-)

Searching your logfiles and your knowledge management sources

A friend of mine pointed me to Splunk for log file analysis, thanks for that :-)

I haven't had a chance to install and try Splunk, but from looking around, it could be the utility to combine knowledge management searches with real-time event searches from servers. A single point of entry for searching is crucial, but not easy to get up and running in day-to-day use.

To benefit from a search engine, that engine should be able to reach all the different places where people put knowledge. And it must be able to crawl all file formats, e.g. OpenOffice, MS Office, Excel, PDF, etc. We can get file indexing working from all kinds of places, but the hurdle seems to be indexing mailboxes! An example is a public mailbox archive of all the support answers to customers, with many years of useful knowledge! Indexing mailboxes, e.g. Lotus Notes, should be possible with enterprise search engines like Google and Yahoo OmniFind.

For logfile analysis I usually stick with simple tools à la fetchlog, our own grep scripts on centralized syslog servers, and some OSSEC. Another utility I have played with for correlating information is Prelude.
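The grep-style scripts are usually nothing more than pattern matching over the syslog stream. A minimal sketch with hard-coded, made-up log lines (in practice they would be read from the centralized syslog files):

```perl
use strict;
use warnings;

# Illustrative sshd syslog lines; in real use these would come from
# /var/log/messages or the centralized syslog server.
my @log = (
    'Feb 19 10:01:02 host1 sshd[123]: Failed password for root from 10.0.0.5',
    'Feb 19 10:01:05 host1 sshd[124]: Accepted password for alice from 10.0.0.9',
    'Feb 19 10:01:09 host1 sshd[125]: Failed password for admin from 10.0.0.5',
);

# Count failed logins per source IP
my %failed;
for my $line (@log) {
    $failed{$1}++ if $line =~ /Failed password for \S+ from (\S+)/;
}
printf "%s: %d failed logins\n", $_, $failed{$_} for sort keys %failed;
```

This is the whole trick: a regex, a hash, and a report. Tools like OSSEC and Splunk earn their keep when you need correlation across hosts and time, not just per-file counting.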

Perhaps Splunk can combine the above (search engine and logfile analysis) into one application?

Splunk provides a free edition, so I will keep it around in case I get a chance to try it :-) It sure seems worth a try for an enterprise! Of course, being an open source and community fan, I am more biased toward an open source alternative to Splunk. Prelude and OSSEC are both open source free software.

While looking around I stumbled upon an interesting open source site, Softpanorama.org:

Mission and Vision Statement: This is a self-education oriented site (see about for more info) that contains resources for independent study in computer science and programming. The latter is the area where open source really shines: the academic value of open source software (OSS) cannot be overestimated.

Softpanorama.org has some Splunk entries in their Log Analyzers News:

[Apr. 17, 2006] Splunk Welcome
Splunk is search software that imitates Google search engine functionality on logs. It can be considered the first specialized log search engine. It can correlate some alerts:
Splunk
Splunk User's Guide
Splunk Administrator's Guide

[Feb 16, 2006] Splunk, Nagios partner on open-source systems-monitoring tools
Log file search and indexing software vendor Splunk Inc. announced Tuesday that it will soon add systems management host, network and service monitoring capabilities to its software through a partnership with the Nagios open-source project. ...

Monday, February 11, 2008

Not satisfied with your current Version Control System - discussing switching VCS

At work we are getting increasingly annoyed by the rather old Visual SourceSafe we are using. We are going for AccuRev as a replacement. There is an interesting comparison with Subversion. Their Subversion notes might be true, in the sense that you do need some scripting skills to take full advantage of Subversion branching and merging. Perhaps this is what you get for the license fee. The AccuRev server runs on Windows, Mac and Linux; not sure about BSD flavors. It does not come for free:
AccuRev is typically licensed using a named user license model. The
list prices for AccuRev end-user licenses range from $750 to $1,995, depending
on specific products licensed, number of users, and required integrations with
3rd party products (e.g., AccuBridge)

If you want a good read on version control system discussion and thoughts, I recommend the FreeBSD Wiki on the VCS subject. It is very well written, and touches many aspects of version control (including some you probably didn't think about). Of course it is written with reference to the FreeBSD project's needs, but if you are familiar with FreeBSD branches and ports, and work with vendor code yourself, you might get a lot of knowledge and ideas from reading it. I found it very interesting :-)

In short, it is a discussion of open source version control system alternatives, with descriptions of desired and required features, in order to justify the cost of the FreeBSD project switching away from CVS. It is similar to our own thoughts on changing version control systems here at work.

Most of it is written by Peter Wemm, who is vouching for Subversion. Here is a snip from Peter's view on why FreeBSD needs a new VCS and why Subversion should be the prime target. It should convince you to start reading :-)
Why does my opinion matter? I've been doing this for a while. For the last 13 years, I've been the 'The buck stops here' guy for our repository. I've seen it all. I wrote the rules about what we can and can't do in the repository. I did the hacks to the cvs system to prolong its use for us. I came up with or implemented most of the hair-brained ideas that we live with on a daily basis.
Here are my snips from my reading through all the sections:

Automated or mechanically assisted merging. FreeBSD's development model requires that (unless it's an exceptional circumstance) changes first go in to the HEAD. If they are suitable candidates to go in to stable then they should be merged to the relevant stable branch.
In addition, new features may first be developed on a separate branch, before being merged in to the HEAD.
The VCS should support easy merging of changes from HEAD (or its equivalent) to the stable branches, and from feature branches to HEAD. Merges should also be able to go both ways, and be easily repeatable (e.g., a long lived feature branch may merge changes from HEAD on to the branch several times, and may merge changes from the branch back to HEAD several times)

Branch, Easy & cheap branches (and history-aware merging) and tags to enable parallel lines of development (that is essential for projects like SMPng which have a very big impact on many source files)


SVN Repo Layout: A proposed repository layout if FreeBSD moves to Subversion. This includes a good suggestion of handling Vendor code.

SVN Merging: A walkthrough of merging changes with Subversion and svnmerge.py. This walkthrough of branching and merging is very educational :-)

ACL, Access control: the ability to constrain developers to operating in specific areas of the tree, implement branch-based policy restrictions, as well as to enforce policy such as tagging of commits for developers working outside their normal areas. Implementing these via hooks would not be a regression from what we currently do in CVS.

Offline, Ability to work offline -- like on a plane -- without requiring too much work: not only being able to list differences but also to commit

SVK which brings history-aware merging and distributed features to SVN

There are some really interesting (biased, of course) quotes when it comes to comparing a Git conversion with a Subversion conversion when coming from CVS, which are right on and make you think:
For us to switch to svn would be an evolutionary step. We could use it as a better cvs, with the sharp edges fixed. hg and git require more of a revolution in the way we go about things.

git/hg make it very easy to take stuff offline.... Encouraging the taking of stuff further offline is going in the wrong direction for *us*. If anything, we need to make it easier for people to get stuff to us and in the tree in some form.

Linus wrote git to suit his needs for linux. He has one thing going for us that we don't. There is a large cult of personality surrounding Linus. There is intense pressure to "validate" your work by getting it approved (directly or by proxy) by Linus. On the other hand, we already have problems extracting work from people. We can't assume that we'll get the same inward flow that Linus gets.

From http://lwn.net/Articles/246381/ - there are some choice quotes. The topic is the problems the KDE folks had making git work for them.

We're not Linux. A good number of our best supporters stick with us because we're a coherent tree and not like linux' chaos.

Why do you seem to be pushing subversion? It's because I am. I think the whole hg/git thing is a distraction.

  • it works the same way we've become accustomed to cvs working. Except without most of the silly problems/restrictions.
  • there are a huge bunch of tools out there to talk to svn. Things like svnsync (cvsup for svn repository replication) are out there.
  • We can use live changeset based exporting to export the tree to cvs to maintain HEAD and RELENG_* branches. Our end users will be able to keep doing exactly what they've always been doing for getting their "fix" of freebsd.
  • svk, as an optional add-on, gets you the ability to have a private playground, in spite of my encouragement to work on the public servers.

Notes on Git Conversion: Why git is interesting to FreeBSD, is also very educational. From the little bit of Git reading that I have done, it seems to me that Git gives you the ability to hide development cycles, which is not something I would appreciate in the projects I participate in. Some Git quotes:

git is distributed

Now, you can commit as you develop, then test, then push. If you find things in your testing that are wrong, you can commit fixes before pushing, or even go back and edit your local history to erase your mistakes, making you look even more ninja than you really are.

You can also push your changes up to a personal repository for others to access. They can merge it to a personal tree of their own, do repeated merges all sorts of directions, and have it just Do The Right Thing.



I am a fan of Subversion, and it works on many platforms. So far Subversion has met all my needs for version control, automation, documentation, management, etc.!

After reading the above articles I am even more convinced that Subversion will continue to meet my needs, so I am not changing :-) SVK is something for me to try, though. And AccuRev might prove useful for the enterprise; we will see.

Wednesday, February 6, 2008

Query MSSQL from perl

I mentioned how to connect to MSSQL from a batch file, e.g. using osql.exe, but today I wanted to do the same from Perl.

There are many samples on Google, using Win32::OLE or Win32::ODBC. Usually finding the right connection string is the hurdle.

For the ODBC connection strings it can look like this:

# Note: the string must be in double quotes for the variables to interpolate
# ($dbname holds the database name; $db is used below for the connection object)
$DSN = "driver={SQL Server};Server=$hostname\\$instance;database=$dbname;uid=$u;pwd=$p;";
if (!($db = new Win32::ODBC($DSN))) { die "Error: " . Win32::ODBC::Error() . "\n"; }

For Win32::OLE connection string with password can look like this:
my $ConnStr="Provider=SQLOLEDB;Initial Catalog=$db;Data Source=$server;User ID=$u;Password=$p;Network Library=DBMSSOCN";

But I really want to avoid putting the user and password in scripts. So for Win32::OLE, a connection string with integrated security, without a password, can look like this:
my $ConnStr="Provider=SQLOLEDB;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=$d;Data Source=$s;use Procedure for Prepare=0;Connect Timeout=4;Trusted_Connection=Yes";
# Provider=SQLOLEDB.1 or Provider=SQLNCLI.1

Testing the connection, creating a query, executing it, and working with the results is pretty straightforward:
my $Conn = Win32::OLE->new('ADODB.Connection');
$Conn->Open($ConnStr);
my $err = Win32::OLE::LastError();
if ($err) { print "FATAL: no connection: $err\n"; exit; } else { print "Connected OK\n"; }
my $Statement = "select servername from servertable where x = 0 AND id = 11";
my $RS = $Conn->Execute($Statement);
if (!$RS) { print Win32::OLE->LastError(); exit; }
while (!$RS->EOF) {
    my $servername = $RS->Fields(0)->value;
    print "servername is: $servername\n";
    $RS->MoveNext;
}
$RS->Close;
$Conn->Close;

Just for future reference, the ODBC SQL insert code could look like this:
$SqlStatement = "insert into dbo.MyTable values ('$var1',$var2,$number,getdate())";
if ($db->Sql($SqlStatement)) { print "Error: " . $db->Error() . "\n"; $db->Close(); exit; }