Junk Drawer – Finally, a quick and simple way to import misc data into SQL

Dale Newman’s post on Junk Drawer caught my eye and I put it on my To Read list (basically one of my browser tabs, mocking me until I got to it). I finally took time to read it and boy am I glad I did.

This definitely goes into my Bat Utility Belt of dev/dba tools. 🙂

Frequently I need to throw some data into SQL so I can query against it or join it against production data. For instance, someone from another department throws a spreadsheet of staff at me and asks if I can add their email addresses. A quick join against our export of Active Directory and away I go.
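
To give a concrete (and hypothetical) example, the join I have in mind looks something like this; the table and column names are made up for illustration, with Junk Drawer creating the imported table for you:

-- Hypothetical example: match an imported staff spreadsheet against our
-- Active Directory export to pull in email addresses.
select s.FirstName, s.LastName, ad.EmailAddress
from ImportedStaff s
join ADExport ad on ad.FirstName = s.FirstName and ad.LastName = s.LastName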

However, getting this data into SQL is usually a mind-numbing dance with the SQL Import Wizard, fretting that I’ll get a truncation error or that it will type a column as an Int when a single row down on line 1765 happened to have a decimal point. Ugh.

Junk Drawer is a tool any of us should have thought of, and probably did, but Dale actually built it. The article was a real pleasure to read. Dale is a fun and honest programmer in that he admits “good” techniques like dependency injection just don’t always make it into one-off toolbox tools. It’s the perfect mix of humor and real-world practicality.

Thanks Dale! If you ever need to quickly import misc data into SQL you owe it to your sanity to check this out. 🙂

How to Create Great Rounded Corners on Images with an Alpha Mask and Paint.Net

For a recent website I was working on I wanted to have some nice looking rounded borders for the main body of the page and some of the images.  Since I’m an avid jQuery user it was a no-brainer to use the jquery.corner plugin.  It looks great and works across multiple browsers.  It defaults to using more modern CSS styles for newer browsers but falls back on some neat tricks for older browsers.

Unfortunately the tricks don’t always come out perfect, and some users with older browsers complained the image corners were “jaggy”. Rather than tell any users they needed a newer browser, I decided to make it look nice without the hack by actually creating images with rounded corners. Yeah, we’re still in the early 2000’s sometimes. 🙂

Paint.Net is a great tool that meets many of the needs of web developers.  Naturally this was my first stop.  There are several great tutorials for corners and even some nice plug-ins already created; however, none of them seemed simple or did what I wanted.

For instance, here’s a great tutorial from Micah Sewell on manually creating rounded corners. Unfortunately, it forces you to use the rounded rectangle tool, which has no options for changing the corner size or positioning it exactly, nor does the process described create antialiased corners, an absolute must.

Here’s a great plugin by Tim Mathais for creating rounded rectangles for things like buttons, but this doesn’t work on an image that already has a background.

What I would rather do is draw upon the power of alpha masks.  If you don’t know what these are, take a look at the great Wikipedia article on them.  In essence you use a grayscale image to make portions of your desired image opaque or transparent.  For example, if you apply a grayscale gradient running from top to bottom as an alpha mask, your image will fade from opaque to transparent (or vice-versa).

Using Tim’s plugin and the Alpha Mask Import plugin we can get exactly what we want: user-customizable rounded corners on images, with antialiasing.  Tim’s plugin allows great control over your borders and the coloring used for the mask.  The Alpha Mask plugin has handy features like pulling the mask in from the clipboard rather than having to save it to a file first, which is exactly what we’ll do.

To do this simply:

  1. Load the Rounded Rectangles and Alpha Mask Import plugins into Paint.Net.
  2. Open your desired image in Paint.Net.
  3. Create a new layer and switch to that layer for drawing.
  4. Using the Rounded Rectangle plugin, create a new rectangle with your desired properties.  To work as an alpha mask it needs to be grayscale, so make sure you don’t use color.
  5. This should place a rectangle with your desired settings on the second layer.  My rectangle is simply black, antialiased, with a radius of 20.
  6. Select the entire layer and copy it to the clipboard.  Now that the mask is in memory you can either delete the mask layer or simply hide it.  I sometimes make mistakes and hate redoing work so I just hide mine until I am sure that I’ve got what I need.
  7. Now, on your original image, simply run the Alpha Mask plugin; it will default to pulling in the mask you copied to the clipboard.  You may need to check the Invert Mask option depending on how you created the mask.
  8. Hit OK and you’re all set with great results!  This process only takes a few seconds once you get it down and it is very customizable. 

Enjoy! 😉

What’s the shelf life of a techie?

A valued colleague just sent me this interesting article:

http://economictimes.indiatimes.com/news/news-by-company/corporate-trends/whats-the-shelf-life-of-a-techie-just-15-years/articleshow/17251620.cms

It proposes that the shelf life of a techie is only 15 years.  One leader interviewed claims that the 20-somethings in an organization provide more value than the 35-year-olds.

Definitely interesting and provocative.

I’ve had these same fears (now that I am 35 myself). I remember what I used to think of my “elders” back when I was in my early 20s. They were behind the times and couldn’t seem to get out of first gear mentally. Now that I’m the mid-thirties guy, I have a sneaking suspicion that the 20-somethings look at me the same way. What if it is true?

“’The shelf life of a software engineer today is no more than that of a cricketer – about 15 years,’ says V R Ferose, MD of German software major SAP’s India R&D Labs that has over 4,500 employees . ‘The 20-year-old guys provide me more value than the 35-year-olds do.’”

Yeah, but Ferose forgets that he’s still the one running the company. He brings a global vision and makes use of the ideas and products of the 20- and 35-year-olds. There’s more to this business than knowing the intimate details of every new thing out there. You always need your base of people walking on the edge, and you need to listen to them.

As a leader (as you are, Gina, at your site and in this district) I realize this is a matter of perspective and a willingness to stay in the game. If I’m willing to learn from the new trends and use them as a platform for what our vision and goals should be, then we can keep up. It will take some work, but that is a basic tenet of the industry we are passionate about. Gone are the days when IT should decide in a vacuum what hardware is the standard. It’s no longer about stability, consistency and what we can support with the fewest number of people. It’s about what resources and capabilities our users need to do their jobs in a modern world. If a 7 lb. laptop that only connects to Windows doesn’t cut it, then we need to give our users the opportunity to use whatever device allows them to work the way they need. This is driving my vision towards iPads, tablets, Chromebooks and personal devices on our network, all allowing teachers and students to use tools that enable them to collaborate, such as Haiku, Google Docs, My Big Campus, Edmodo, etc.

However, I also have to put my “elder” cap on. I need to inspire and cultivate innovation in a way that is supportable with few people, financially reasonable, and doesn’t overwhelm our staff and students with so many choices that they can’t even make a decision. This is what I call Managed Innovation (I’m still working on that term). Anyway, rather than opening the Web 2.0 floodgates, we provide a small set of highly capable choices for each solution space, e.g. Haiku or My Big Campus, and Microsoft Office or Google Docs. This allows us to continue to provide support and training while also offering flexibility. When we identify two options, both of which solve the same problem in different ways, we allow choice while still offering support such as training and integration with our SIS, and while still ensuring compatibility with the software, hardware, and network we provide.

I’ve seen the benefits and eventual defeats of both extremes. In an effort to provide maximum reliability and efficiency, some environments are so controlled that they cannot adapt to new requirements. Eventually they lose focus on the goals of the organization and build resentment among their captive users.

At the other end, some organizations remove all barriers, hoping to foster (and initially succeeding in creating) a fluid and flexible environment that takes advantage of the latest gadget or resource. Unfortunately, this in turn creates an organization where staff who used to collaborate can no longer speak the same language. Their documents are stored in different formats, they use platforms that cannot integrate, staff entering this new flexible environment don’t know which tool to start with, and the ability for colleagues to train and support each other no longer exists. Worse yet, the students bear the greatest burden, forced to learn how to communicate in whichever online platform each of their teachers desires to use. Rather than becoming skilled natives they are traveling foreigners. And when staff seek out help and guidance, without a district-wide consensus the IT department can no longer offer support, nor even guarantee that the equipment and resources it is tasked to provide will integrate with base systems like the SIS or remain compatible with all the different online tools available.

Yes indeed, gone are the days when we could rest on a foundation that changed only every 5 years. But rather than lose hope we need to run pilots, test out new systems, see what our trailblazing teachers and students settle on that works for them, and then support only the top-tier initiatives. The explorers know they are in new territory and there may be bumps and even U-turns along the way. But give them the ability to try. When they find the next incredible tool, learn from them, support them and help steer the rest of the organization in those directions.

Should SSMS be built into Visual Studio?

SQLServerCentral sends out a daily email, and today’s included an interesting offer to take a survey from Redgate.  They were asking questions of Visual Studio developers who heavily use/interact with databases.

If you fit into this crowd help out by taking their survey here.

Initially I really distrust the idea of having SSMS-like functionality within VS.  I know some of it is already there, but I have largely avoided those features, though I try them from time to time just to give them the benefit of the doubt.

For some reason Microsoft tends to dumb down anything in VS that’s not strictly developer oriented.  This is a huge departure from their current “give the power to the programmer” mentality when it comes to features like VS Add-Ins, NuGet, T4 templates, EF4 Code First, etc.  For some reason DBA tools have fallen into this “black magic, we’ll hide it for you” dark side of VS.  So, since VS 2003 and probably a little before, any database interaction was kept at a very high level.  You simply couldn’t dive right into necessary tools like T-SQL easily.

Consequently I’ve always developed with VS and SSMS side by side.  So, I’m in that old-school stick-in-the-mud crowd waving my cane in the air shouting, “I’ll give you my SSMS when you pry it from my cold, dead hands!” Call me old fashioned.

As SSMS just continues to improve (throw SSMS Tools and other indispensable add-ons into the mix) I have had no desire to even attempt DBA functions within VS.  Honestly, SSMS is becoming so nice to use that it is pretty much on par with my respect for VS 2010 as the developer’s IDE.  SSMS is the DBA’s tool of choice for me, and quite happily so.

If they literally took SSMS and put it into VS I think I’d still avoid it.  The need to Alt-Tab between the two creates a mental context switch in my head that actually helps my productivity.  Read my post Alt-Tab Aids My Mental Context Switching for more on this; it is a real productivity asset to think about.

If switching between SSMS and VS tools (literally and mentally) all inside the IDE were any more complicated than a simple Alt-Tab I would not be inclined to switch. 

It would need to offer enough extra “can’t live without” features to overcome the current simplicity of Alt-Tabbing between VS and SSMS.  For instance, I would love to be able to code against a dev database and have the IDE help generate a change script, all kept in source control.  The entire database design should be kept in source control as well, so that when I add a column or an index, the change script and create script are entered on the next commit.
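
To make that concrete, here is a minimal sketch of the kind of change script I would want generated and committed for me (the table, column and index names are purely illustrative):

-- Hypothetical change script: the IDE would generate this from my edits
-- against a dev database and commit it alongside the updated create script.
alter table dbo.Student add EmailAddress varchar(255) null
go
create nonclustered index IX_Student_EmailAddress on dbo.Student (EmailAddress)
go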

Alt-Tab Aids My Mental Context Switching

A recent blog post about combining the functionality of SQL Server Management Studio within the Visual Studio IDE got me really thinking.

At any given moment I have at least 6-8 applications running, usually quite a bit more.  When developing an app I usually have a work environment consisting of VS 2010, SSMS, Firefox (with at least 6 tabs open), Outlook, Word, etc.  Each application is only an Alt-Tab away.

I initially rebelled at putting SSMS DBA features into VS 2010 (yes, I know they are already there, I just don’t use them).  Much of this has to do with the fact that historically these tools have lacked functionality, but I now realize a large part is the mental context switch that Alt-Tab provides.

If someone walks into my office or a phone call grabs my attention, I can easily Alt-Tab to another app, or simply Ctrl-T to open a new tab in Firefox (which starts at Google), ready to handle that particular request.  It doesn’t matter what I was doing before.  For some reason Alt-Tab simply pushes my current mental state onto the stack (forgive the metaphor 😉), ready to be popped when I Alt-Tab back sometime in the future.

Switching between different duties using Alt-Tab seems to really trigger a context switch in my head.  All this happens in the .2 seconds it takes to press Alt-Tab.

While DBA duties are tightly integrated into my development process (in my work I am the developer and the DBA), I love the clean separation of concerns when I Alt-Tab between VS and SSMS.  Even if SSMS were completely duplicated within VS I don’t think I’d care for it.  There is just such a satisfaction in mentally putting on my DBA hat for SSMS work, even if it is simply for 20 seconds while I add a column to a table.  It just seems cleaner.  Unless the integration of the two tools provides support that each tool individually couldn’t accomplish, I actually see less value in combining them.  Additional features, such as source control over my database assets and generated data migration scripts, would go a long way toward convincing me these two tools really should be combined.

Now I guess I need to start evaluating add-ons and other tools based on how easily I can switch the context in my head when I need to use them.

Backing up to iCloud Just Doesn’t Fit for the 99%

We use iPads within our organization and the use is growing.  This starts to present IT support concerns, especially when major iOS updates like iOS 5 come out. 

iCloud is touted as a great answer to several issues because it allows you to back up devices over the Internet.  I agree that this is a great idea for consumers and potentially for enterprises, but at the moment it doesn’t live up to that promise.

Doesn’t Work for the Enterprise Yet

Unfortunately iCloud really doesn’t seem to be a solution for enterprise. 

iCloud might work for carts if:
•    It wasn’t limited to 10 devices (does Apple not intend this for enterprise use? Not even a single cart?)
•    It could back up a single image to the cloud that could then be restored to any number of devices, such as the 30 devices in a cart.  There’s no sense in backing up numerous duplicate images.
•    Restores could be done in batch without having to start it from each iPad individually (now we’re back to using iTunes on the cart)

Doesn’t Work for the Average Consumer Yet

From both the enterprise and consumer points of view, iCloud’s initial 5GB may seem generous compared to other cloud offerings that typically only give 2GB, but backing up a standard 16GB iPad fills that 5GB quickly.  Worse yet, it gets filled without any real knowledge of what’s going on.  With Dropbox and others I am quite aware when I put large files in.  With my iPad I may download a few apps, take a few videos, and suddenly my iCloud storage is out of space when I had a gig available yesterday.

It gets worse when our users are backing up their phone and iPad to the same iCloud account. Quite quickly they find themselves needing to upgrade their iCloud storage.  $100 a year (for one iPhone and one iPad) is pretty hefty.

Not to say that Apple’s pricing isn’t in line with other similar cloud storage vendors; it just doesn’t seem to translate well to “iDevices”.

Works for the 1% (Occasional Low-End User)

The iPad is great for my grandparents, who currently have a WebTV.  They get around the Internet great, but I don’t want to give them a computer that has to be updated, kept current with the latest codecs, and is prone to failure.  An iPad would be a great fit.

They probably won’t be power users; they’ll play with the occasional app and would get a great kick out of sharing a Photo Stream or using FaceTime.  I think they would easily fit into the 5GB iCloud capacity and take great advantage of updating to future OS versions over the air.  Right now I think this is the best target market for iCloud.  Unfortunately it’s a very small target market.

I hope that Apple will address these issues in the next few “versions” of iCloud.  I have no doubt that they can and will.  Over the next couple of years cloud vendors will offer more and more space, lower prices and interoperability with more devices.

A Quick Script to Help Update Your Current Student Photos in Destiny from Aeries

At my district we use Aeries, our Student Information System from Eagle Software.  We also use Destiny, our district-wide library system from Follett Software.  Both are great products and, better yet, both are housed on our Microsoft SQL Servers.  This allows us another level of integration since we can get to the raw database for various needs.

One way we take advantage of this is in how we load our student photos into Destiny.  Destiny takes a zip file containing jpeg files of the students as well as an idlink.txt file, an index file that ties each jpeg to the student’s record.

I’m subscribed to the Follett Talk listserv, an email list that several Follett Destiny administrators are a part of, and every day there are great discussions and help from various users around the country.

When I mentioned that we had a script to help out when updating student photos in Destiny, several people were interested.  I’ve been trying to write up a formal blog post on this with great screenshots and directions but, alas, I haven’t had the time.

Anyway, one of our neighboring libraries politely called me about it this morning and I thought I should just put it out there as is for anyone who can use it.

The SQL script file and some quick documentation are attached to this post.  Get them below.

What it does is grab all the photos (from the school sites you designate) from the server where you store your Aeries photos (this used to be our ABI server but is now our Aeries.Net server), generate the idlink.txt file and put it all in a zip file for you.  Since it pulls the photos straight from the information in Aeries and generates idlink.txt automatically, I always get the latest photos (such as from a picture makeup day or newly enrolled students) as well as any corrections that may have been entered in Aeries.
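
To give a rough idea of the approach, the heart of the idlink.txt generation is a query along these lines.  This is a simplified sketch, not the full script: I’m assuming a simple “ID,filename” layout for idlink.txt, so verify the exact format against Follett’s documentation.  The where clause mirrors the Aeries conventions in the BlueBear snippet at the end of this post:

-- Simplified sketch: emit one idlink.txt line per active student, tying
-- the student's ID to a jpeg file name. The "ID,filename" layout is an
-- assumption; check Follett's documentation for the exact format.
select cast(stu.id as varchar) + ',' + cast(stu.id as varchar) + '.jpg'
from stu
where del = '' and tg = ''              -- skip deleted/inactive records
and stu.sc in (310, 320, 410, 430)      -- the school sites you designate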

The script runs in less than a minute for our entire district and then I have a nice easy zip file to upload to Destiny.

There is a little documentation attached, but whoever does this procedure at your site will need some foundational knowledge of Aeries and SQL.  This is not for a librarian at a site who does not have knowledge of, or access to, the Aeries server.

This does not write any data to the Aeries database, so there is no harm in running it.  If there is an error in the script (for instance, you forgot to set your proper school sites and server name) it will simply give you an error but will not change any data, so don’t worry.

I hope this helps!

P.S. For those of you who also use BlueBear, I have two extra lines of code in the script.  We use BlueBear as our ASB accounting system for our high schools.  It takes photos in the same way but needs them unzipped, all stored in a single folder.  The last line in my script unzips the results from above into a folder per school site for our high schools.  This way I just run the Student Photo Update process in BlueBear and point it at the folder for the correct site.

If you have BlueBear here are the two extra lines I use:
-- These two statements emit batch commands: an echo, then one 7-Zip call
-- per high school site extracting that site's Pics.zip into its own folder.
-- Paths assume 7-Zip's default install location and zips under c:\pics\<site>.
print 'ECHO Extracting photos for Blue Bear...'
select distinct '"c:\Program Files\7-Zip\7z.exe" e c:\pics\' + cast(stu.sc as varchar) + '\Pics.zip -oc:\pics\' + cast(stu.sc as varchar) + ' >nul 2>&1'
from stu
join #pics on (stu.sc = #pics.sc and stu.bm = #pics.shortname)
where del = '' and tg = '' and bm <> ''
and stu.sc in (310, 320, 410, 430)

Destiny Student Photo Import Quick Documentation

Student Photo Queries.sql

Virtual Desktop Savings Are Not in the Hardware

Here’s a great article I was recently referred to:

Citrix: Virtual desktops about to become cheaper than physical ones

In it, Citrix is helping to develop yet another low-cost zero-client device.  These are literally just small boxes about the size of a deck of cards, with ports for a monitor, Ethernet and USB.  No moving parts, very slim, reliable and inexpensive.  However, I still maintain that the savings of Virtual Desktop Infrastructure (VDI) are not in the hardware.

I think the real savings are in staffing and time.  Unfortunately these are much harder to measure but well worth the move in my opinion.

You can already use the hardware you have for VDI, so new zero-clients only save you money on new installations or when you are replacing a desktop that is completely dead.

You still need a keyboard, mouse and monitor to hook up to zero-clients, plus Windows/Office and CAL licenses.  If you subtract the cost of the licenses and the keyboard/video/mouse, these days you can purchase a reliable desktop for probably about $500, and even less if you plan to purchase refurbished machines and replace them more often.

So, initially you probably do save $300-$400.  However, you must move these virtual desktops to a server in the data center that you never needed before.  More than likely you are moving them to a high-capacity set of servers, much like the Cisco UCS system we are rolling out at my district.  This is very expensive.

It doesn’t stop there.  My network manager and I attended a VDI training and learned that you have to make sure your network can withstand the high traffic all the way from the data center to the user’s “desktop”.  It was fine if a user had to wait a few seconds to open a Word doc, but waiting a few seconds for a mouse click to register is unacceptable.  Suddenly you may have to upgrade much of your network infrastructure.  This is much the same issue you encounter when you first migrate to IP phones.

Oh, and what about disaster recovery?  With Virtual Desktop you have effectively moved each user’s “desktop” to a data center.  Right now at our district, if a desktop dies we advise the user to log onto another one nearby and schedule a tech to replace their desktop within 24 hours.  With Virtual Desktop, if the connection to the data center goes down there is no nearby desktop to go to.  So you have to have a disaster recovery data center (e.g. at our District Office).

This starts sounding really expensive really quickly.  Fortunately VMware (et al.) have great tools that let you get the most out of your equipment.  A classic disaster recovery site sits unused most of the year and only kicks in when you have an emergency.  From a business cost perspective it is a loss except for those few minutes it is in use.  VMware lets you cluster your data centers together, effectively using them in tandem for full production load balancing.  If one were to go down, the other would simply take the entire load.  You still have to purchase double the equipment, but it is used much more efficiently.

I would imagine the economics improve as you scale.  Implementing VDI on 90% of the computers at a site is likely much more economical than on, say, 50%.

Like I said above, I think the real savings are in staff and time.  If you can reduce the amount of equipment you have to service while increasing its reliability, you don’t have to hire additional techs as you grow your desktop base, and your current techs can deal with higher-level issues.  The same goes for your network services staff at the data center.  In addition, your users as a whole experience far less downtime and a far more reliable working environment.

How do you measure those benefits?  Theoretically you can measure the amount of technology growth vs. payroll, but you really can’t measure downtime for the users unless you have a very sophisticated ticket system where you can somehow quantify and compare downtime across users.