Earning Dividends on Your Mistakes

We often think of mistakes as horrible things. We label them bad, negative, or failed actions. However, it’s possible to learn lessons from our mistakes that bring lasting value. While reading a book published in 1911, titled How to Systematize the Day’s Work, I came across the following excerpt:

Dividends on Mistakes

A mistake may be made the keystone of system – the foundation of success. The secret is simple: Don’t make the same mistake twice.

The misspelling of a customer’s name – an error in your accounting methods – an unfulfilled promise; these are valuable assets if they teach you exactness.

Let your mistakes shape your system and your system will prevent such mistakes. When you discover a mistake, sit down then and there, and arrange the system to prevent its repetition.

Paint it on your walls; emblazon it on your door; frame it over your desk; say it to your stenographer; think it to yourself; burn it into your brain; this one secret of system, this one essential to success: DON’T MAKE THE SAME MISTAKE TWICE. (emphasis original)

As I read this section, I couldn’t help but think about the years of teaching I’ve delivered on documentation, its importance to effective troubleshooting and operations, and its role in the process of becoming an expert. Learning from your mistakes is a big part of becoming an expert, and it is a significant factor in becoming an effective technician. Ineffectiveness is often born of ignoring our mistakes, which all but guarantees their repetition. What an excellent insight to begin the new year!

WLAN (Wireless LAN) Administration Guidelines

Best practices provide a foundation on which an individual organization can build its specific policies and procedures. Wireless networks do not require the reinvention of administration best practices. Several can be borrowed directly from wired network administration, including:

  • Configure devices offline
  • Back up configurations
  • Document changes
  • Update devices periodically
  • Perform analysis occasionally

Configuring devices offline provides two major benefits: improved security and greater network stability. Security is improved because the new device is not connected to the network until it is configured according to organizational security policies. Stability is improved because devices are added to the network only after they are configured to operate properly within the network. This best practice should be part of any IT organization’s operational procedures.

Initial device configuration can take anywhere from a few minutes to a few days. As a wireless technology professional, you will want to avoid unnecessary manual reconfiguration. The best way to avoid this extra work is to back up the configuration settings for any device that provides a backup facility. Many devices allow you to save the backup to a file stored separately from the device; others allow only internal backups stored in the memory of the device. While an external backup is preferred, the internal backup should be used if it is the only method supported. Even with modern “centralized” WLAN technologies, something has to be backed up (for example, the controller or the cloud) by somebody (for example, you or your service provider).
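
As a minimal sketch of what automating this might look like, the following assumes a device that exposes its configuration over SSH and the third-party netmiko library; the device type, hostname, credentials, and show command are placeholders, and your own equipment may require a vendor-specific tool instead:

```python
# Minimal sketch: pull a device's running configuration and save it to a
# timestamped file. Assumes a Cisco IOS-style device reachable over SSH and
# the third-party netmiko library; host and credentials are placeholders.
from datetime import datetime
from pathlib import Path

from netmiko import ConnectHandler

def backup_config(host: str, username: str, password: str,
                  backup_dir: str = "config_backups") -> Path:
    """Save the device's running config to config_backups/<host>-<stamp>.cfg."""
    conn = ConnectHandler(device_type="cisco_ios", host=host,
                          username=username, password=password)
    try:
        config = conn.send_command("show running-config")
    finally:
        conn.disconnect()

    Path(backup_dir).mkdir(exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    out_file = Path(backup_dir) / f"{host}-{stamp}.cfg"
    out_file.write_text(config)
    return out_file

if __name__ == "__main__":
    print(backup_config("ap1.example.com", "admin", "secret"))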

Device configurations are often modified several times over their lifecycle. It is not uncommon for a device to be modified more than a dozen times a year. These configuration changes should also be saved to a backup. If the device supports it, I usually back up the initial configuration and then back up each modified configuration to a separate backup file. However the backup is performed, it is important to back up the changes as well as the initial configuration. As much as we talk about the importance of documentation, IT professionals seldom document the minor changes they make to device configurations. These minor changes add up to a big difference over time, and the easiest way to document them is to back them up.
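
If you keep each backup in a separate file as suggested above, turning two successive backups into a readable change record takes only the standard library. Here is a small sketch (the file names are hypothetical):

```python
# Minimal sketch: document a configuration change by diffing two successive
# backup files with the standard library. File names are hypothetical.
import difflib
from pathlib import Path

def config_change_log(old_backup: str, new_backup: str) -> str:
    """Return a unified diff describing what changed between two backups."""
    old_lines = Path(old_backup).read_text().splitlines(keepends=True)
    new_lines = Path(new_backup).read_text().splitlines(keepends=True)
    diff = difflib.unified_diff(old_lines, new_lines,
                                fromfile=old_backup, tofile=new_backup)
    return "".join(diff)

if __name__ == "__main__":
    print(config_change_log("ap1-20130101.cfg", "ap1-20130215.cfg"))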

Finally, occasional analysis of the network will allow you to determine whether it is still performing acceptably. On wired networks, administrators spend most of their time analyzing performance from a strict data throughput perspective (though security monitoring and occasional troubleshooting are also performed). On wireless networks, coverage must also be considered. Are the needed areas still receiving coverage at the required data rates? If you look only at the throughput at the APs, you may miss problems occurring in coverage patterns. If you look only at the coverage, you may miss problems related to throughput. Both are important.
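
As a toy illustration of checking both at once, the following flags survey points that fail either requirement. The thresholds and readings are made-up values; a real validation survey would collect them with proper survey tools:

```python
# Toy sketch: flag survey points that fall below a required signal strength
# or throughput. The thresholds and readings are illustrative values only.
REQUIRED_RSSI_DBM = -67       # a common design target; adjust to your needs
REQUIRED_THROUGHPUT_MBPS = 20

survey = [
    {"location": "Lobby",     "rssi_dbm": -61, "throughput_mbps": 42},
    {"location": "Warehouse", "rssi_dbm": -74, "throughput_mbps": 12},
    {"location": "Conf Rm B", "rssi_dbm": -66, "throughput_mbps": 18},
]

for point in survey:
    problems = []
    if point["rssi_dbm"] < REQUIRED_RSSI_DBM:
        problems.append(f"weak coverage ({point['rssi_dbm']} dBm)")
    if point["throughput_mbps"] < REQUIRED_THROUGHPUT_MBPS:
        problems.append(f"low throughput ({point['throughput_mbps']} Mbps)")
    if problems:
        print(f"{point['location']}: " + ", ".join(problems))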

In addition to these practices borrowed from the wired networking world, wireless networks introduce new guidelines. These wireless-specific guidelines include:

  • Test the RF behavior after environmental changes
  • Update security solutions as needed
  • Remove configurations from decommissioned devices

The first wireless-specific guideline is really a subset of the wired best practice of performing occasional analysis. As I stated previously, wireless networks introduce the need to look at more than throughput metrics at the port level. We must analyze the RF behavior and ensure that coverage is still provided where it is needed. This extra requirement is driven by the nature of RF communications. Even without enterprise-class monitoring systems, the small business or home office will require occasional analysis and adjustments based on the results.

Wired and wireless networks both require updated security solutions, but if history is our teacher, wireless networks may require such updates more frequently (though the last five-plus years have honestly been mostly quiet in this area, as WPA2 has proven very worthy so far). The nature of wireless communications allows attacks to be made without physical access to the premises. This fact may be the reason behind the more rapid discovery of vulnerabilities. WEP was shown to be flawed in less than three years. WPA and 802.11i have a backward-compatibility weakness when using TKIP that may allow for ARP poisoning or denial-of-service attacks, and this weakness was discovered within five years of ratification. The problem is that these solutions (WEP and 802.11i) are intended to provide wireless with security at or above the level of a wired network (WEP stands for Wired Equivalent Privacy), and yet they do not always achieve it. Since new exploits are discovered periodically, we may be forced to change the security solution we’re using every three to five years (though the past several years have shown greater general stability). I am using a wired Ethernet port right now that was installed more than ten years ago – no security changes have been needed to meet the level of a physical port because it is, well, a physical port.

However, this issue of meeting wired equivalence may matter less than it is often made to seem. Do we really need to ensure that our wireless links are equivalent to our wired links? Not if they are used for different things, or if we can provide effective security at higher layers. For example, some organizations require IPSec VPN tunnels for any wireless links that connect to sensitive data, though this has become far less common today given the strength of WPA2.

Finally, since the security settings of the wireless network are often stored in the APs and client devices, it is crucial that you remove the configuration settings before decommissioning the hardware. If you leave the WPA passphrase (used with WPA-PSK) in the device’s configuration settings, the next person to acquire the equipment may be able to retrieve the information and use it to gain access to your network. The likelihood of this occurring is slim (very slim), but it doesn’t take long to remove the configuration and it is common for machines to be wiped before decommissioning them anyway.

These guidelines give you a good starting point. Do you have additional recommendations?

You Ate Your Cheese

“Who Moved My Cheese?” is one of the most popular books ever written about change and how to cope with it in your life; however, I would suggest that, for IT professionals and many others, we need an eye-opening, honest book with a title more like “You Ate Your Cheese.”

You see, the point is that most of the career-challenging and life-altering work-related changes that occur in the technology sector can be predicted. For example:

  • If you still desire to be supporting Windows 3.1 computers, you ate your cheese.
  • If you still think modems are the best way to connect to the Internet, you ate your cheese.
  • If you think dBase is a modern database, you ate your cheese.
  • If you think Apple is the winner in the mobile phone space, you ate your cheese.
  • If you think InfoSeek is the best search engine, you ate your cheese.
  • If you think Colorado Jumbo 250 tape drives are still a good backup solution, you ate your cheese.
  • If you think Zip drives are the greatest external storage solution ever made, you ate your cheese.
  • If you think 802.11b wireless is fast enough, you ate your cheese.
  • If you think you can control every device users bring into your environment, you ate your cheese.
  • If you think Commodore will make a comeback, you ate your cheese.
  • If you think Windows XP is here to stay, you ate your cheese.
  • If you think Mac OS X will win the OS wars, hmmm, let’s wait and see.

OK. This should be enough to make the point. You eat your cheese when you stick with the knowledge you have and do not grow and learn with the industry. If you think you can master a technology and then just work with it for 20 years, you’re in the wrong industry. I suggest you consider returning shopping carts to their storage locations at the local department store. It’s one of the few jobs I know of that is still pretty much like it was 20 years ago. Even in that job, many facilities now have motorized cart pushers to ease the strain on the staff.

Do you see the point? You must continue learning in practically all jobs these days, and this is particularly true in IT. If you find yourself in a situation where your skills are no longer in demand, no one moved your cheese; you ate your cheese. It’s time to become cheesemakers and not just cheese eaters. When you use up all the skill you have, it’s often too late to develop new skills. Cheesemakers develop skills continually. Certifications are a great way to do this, but simply learning new skills that you can apply for your current employer or customers is another great way to evolve over time, so that you never get into a situation where you’ve eaten your cheese.

So, the next time someone tells you that someone else moved their cheese, just look them in the eye and kindly say, “No, you ate your cheese.”

NOTICE: This post is not intended to cover all scenarios in life and is likely to have missed many situations where cheese is indeed moved by a third party. In such situations, advice from books like Who Moved My Cheese? may indeed be helpful. Individuals should consider this post to be advice only and not a medical, physical, emotional or psychological solution to the trauma induced by the moving of said cheese.

802.11 in the Search Cloud

An excellent keyword research service is offered by KeyWordEye.com and one of its features, even with a free account, is the creation of a search volume word cloud. The larger words are the more commonly searched words in relation to the keyword on which you build the word cloud. The following image is the resulting word cloud based on the keyword of “802.11.”

[Image: The 802.11 Keyword Cloud]

The lessons we can learn from search volume are tremendous. Over time, we can discover trends and at any moment we can see what people are interested in. Now, keep in mind, people search for things for many reasons, including:

  • Purchasing products
  • Learning how stuff works
  • Looking for definitions
  • Clicking a pre-built search link

Whatever the reason for searching, it is interesting what made it onto the search cloud list. Here are a few that really got my attention:

  • 802.11ac – I expected this to be there, but was happy when it was confirmed.
  • 802.11b – I was amazed how often this is still used as a search word. The word alone, and included in other phrases, totals more than 3,000 searches per month on Google US alone. This does not include other search engines. Interesting!
  • 802.11n frequency – search phrases like this, used in large volumes, reveal the technical proficiency of the audience. Certainly, many people out there are looking for more technical information.
  • 802.xx wiki – Do you want to attract people to your wireless vendor website? Maybe it’s time to start a wiki. Did you see that it is used in phrases including 802.11ac, ieee, and others? Take advantage of it.
  • 802.11g vs 802.11n/802.11n vs 802.11g – People still want to learn about 802.11n compared to their older 802.11g hardware. Tell them what they want to know.

These are just a few insights; I’m sure you can locate more. The most important thing we techies need to learn from this: keyword research can be useful in helping us focus on learning and explaining in-demand technologies (instead of the ones we THINK people should be using).

Three Steps to Becoming an Expert

NOTE: This is an article I wrote several years ago. I hope you enjoy!

Have you ever noticed that experts make more money than generalists? That’s because they specialize and generalists generalize. Or, as Zig Ziglar says, they are a wandering generality.

How did I become an expert in certain areas? How have others always done it? It’s really simpler than you may think and I’m going to reveal it to you in this brief article.

There are three easy steps to becoming an expert:

  1. Choose the Expertise
  2. Make Your Knowledge 90/99
  3. Tell Them What You Know

Let’s look at these three steps individually.

Choose the Expertise

The first step is really the hardest. You wouldn’t think so, would you?

The reason this step is the hardest is that it is the step on which the other two are built. If you ever decide to change your mind about your expertise, it means learning all over again. Therefore, you should put lots of energy into this step.

So, how do you decide on your expertise? Look at what you love and enjoy.

Do you like fishing? Become an expert at bass fishing in the lakes of northern Ohio.

Do you like gardening? Become an expert in growing African flowers in American soil.

Do you like politics? Become an expert in inaugural addresses and their impact on the presidential term.

Notice that I took a generality and made it a specific. You should do this too. I am not just an expert in the field of computers; I also specialize in technical communication skills. I am a general expert in computers/networking and a specialized expert in technical communication skills.

Make Your Knowledge 90/99

I state this when teaching classes on personal growth and I am often asked what I mean. Well, that’s the intention of the statement – to get you to ask.

Here is the answer: You should know more than 90% of the people about your general area of expertise and more than 99% of the people about your specific area of expertise.

Remember that I am a computer expert specializing in technical communication skills. I know more than 90% of the people when it comes to computers, but I know more than 99% of the people when it comes to technical communication skills.

How do you accomplish this level of knowledge? Read, read and then read some more. Go to training classes. Read at least 5 books on the topic. Subscribe to and read 2 or 3 magazines on the topic. Attend 2 training classes per year on the topic. Get experience with the topic.

If you do these things, you will definitely be a 90/99!

Tell Them What You Know

You have to tell people what you know or they won’t know you know… ya know?

The easiest way to tell your peers and managers (or anyone else) what you know is to put it in writing. Write tips and articles for the company employees (like the one you’re reading and enjoying now).

Depending on your desired goal, you may consider writing magazine articles and offering them for free to various publications. Start a blog on the area you’ve chosen. These days, it’s one of the most powerful ways to become known as an authoritative expert. You may even decide to go for the gold and write that book!

Summary

If people look at you as an expert, they will respect your opinion much more. As a matter of fact, if they don’t look at you as an expert, they probably won’t even listen to what you have to say.

In order to become an expert you must first determine the area of expertise you desire. Then come up with a specific area of that expertise to become even more knowledgeable in.

Focus on the 90/99 rule. Make sure you know more than 90% of the people in your general expertise and more than 99% of the people in your specialized area of expertise.

Tell people what you know through articles and tips. Go for the big one and write a book. Do what it takes to get your name out as an expert.

Yes! You can be an expert!

-Tom Carpenter

The Importance of Data Classification (Information Classification)

The importance of security varies by organization. The variations exist because of the differing values placed on information and networks within organizations. For example, organizations involved in banking and healthcare will likely place a greater priority on information security than organizations involved in selling greeting cards. However, in every organization there exists a need to classify data so that it can be protected appropriately. The greeting card company will likely place a greater value on its customer database than it will on the log files for the Internet firewall. Both of these data files have value, but one is more valuable than the other and should be classified accordingly so that it can be protected properly.

Data classification is the process used to identify the value of data and the cost of data loss or theft. Consider that the cost of data loss is different from the cost of data theft. When data is lost, it means that you no longer have access to the data; however, it does not follow automatically that someone else does have access to it. For example, an attacker may simply delete your data. This action results in lost data. Data theft indicates that the attacker stole the data. With the data in the attacker’s possession, the attacker can sell it or otherwise use it in a way that damages the organization’s value. The worst-case scenario is data theft with loss. In this case, the attacker steals the data and destroys the copies. Now the attacker can use the data, but the organization cannot.

When classifying data, then, you are attempting to answer the following questions:

  • How valuable is the data to the organization?
  • How valuable is the data to competitors or outside individuals?
  • Who should have access to the data?
  • Who should not have access to the data?

It might seem odd to ask both of the latter two questions, but it can be very important. For example, you may identify a group who should have access to the data with the exception of one individual in that group. In this case, the group should have access to the data, but the individual in that group should not, and the resulting permission set should be built accordingly. In a Microsoft environment, you would create a group for the individuals needing access and grant that group access to the resource. Next, you would explicitly deny access to the individual who should not have access. The denial overrides the grant and you accomplish the access required.
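
This deny-overrides-grant logic is easy to picture in a few lines of code. The following is an illustrative sketch of the idea only; the names are hypothetical, and this is not the actual Windows ACL evaluation algorithm:

```python
# Illustrative sketch of deny-overrides-grant evaluation, mirroring the idea
# described above (not the actual Windows ACL implementation).
def has_access(user: str, user_groups: set[str],
               granted_groups: set[str], denied_users: set[str]) -> bool:
    """An explicit deny on the user overrides any access granted via groups."""
    if user in denied_users:
        return False
    return bool(user_groups & granted_groups)

# The accounting group is granted access, but one member is explicitly denied.
print(has_access("alice", {"Accounting"}, {"Accounting"}, set()))    # True
print(has_access("bob",   {"Accounting"}, {"Accounting"}, {"bob"}))  # False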

Many organizations will classify data so that they can easily implement and maintain permissions. For example, if data is classified as internal only, it’s a simple process to configure permissions for that data. Simply create a group named All Employees and add each internal employee to this group. Now, you can assign permissions to the All Employees group for any data classified as internal only. If data is classified as unclassified or public, you can provide access to the Everyone group in a Windows environment and achieve the needed permissions. The point is that data classification leads to simpler permission (authorization) management.

From what I’ve said so far, you can see that data classification can be defined as the process of labeling or organizing data in order to indicate the level of protection required for that data. You may define data classification levels of private, sensitive, and public. Private data would be data that should only be seen by the organization’s employees and may only be seen by a select group of the organization’s employees. Sensitive data would be data that should only be seen by the organization’s employees and approved external individuals. Public data would be data that can be viewed by anyone.

Consider the following applications of this data classification model:

  • The information on the organization’s Internet web site should fall in the classification of public data.
  • The contracts that exist between the organization and service providers or customers should fall in the classification of sensitive data.
  • Trade secrets or internal competitive processes should be classified as private data.

The private, sensitive, and public model is just one example of data classification, but it helps you determine which data users should be allowed to store offline and which data should only be accessed while authenticated to the network. By keeping private data off of laptops, you help reduce the severity of a peer-to-peer attack that is launched solely to steal information.
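
To make the model concrete, here is a small hypothetical sketch that maps each classification level to the groups allowed to read it and to an offline-storage decision. The group names and policy values are illustrative assumptions, not prescriptions:

```python
# Illustrative sketch of the private/sensitive/public model described above:
# each classification maps to the groups allowed to read it and to whether it
# may be stored offline (e.g., on a laptop). Names and values are examples.
from enum import Enum

class Classification(Enum):
    PRIVATE = "private"      # select employees only
    SENSITIVE = "sensitive"  # employees and approved external individuals
    PUBLIC = "public"        # anyone

POLICY = {
    Classification.PRIVATE:   {"groups": {"Trade Secrets"},             "offline": False},
    Classification.SENSITIVE: {"groups": {"All Employees", "Partners"}, "offline": False},
    Classification.PUBLIC:    {"groups": {"Everyone"},                  "offline": True},
}

def can_read(user_groups: set[str], label: Classification) -> bool:
    return bool(user_groups & POLICY[label]["groups"])

def may_store_offline(label: Classification) -> bool:
    return POLICY[label]["offline"]

print(can_read({"All Employees"}, Classification.SENSITIVE))  # True
print(may_store_offline(Classification.PRIVATE))              # False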

This data classification process is at the core of information security, and it can be outlined as follows:

  1. Determine the value of the information in question.
  2. Apply an appropriate classification based on that value.
  3. Implement the proper security solutions for that classification of information.

From this very brief overview of information classification and security measures, you can see why different organizations have different security priorities and needs. It is also true, however, that every organization is at risk for certain threats. Threats such as denial of service (DoS), worms, and others are often promiscuous in nature. The attacker does not care what networks or systems are damaged or made less effective in a promiscuous attack. The intention of such an attack is often only to express the attacker’s ability or to serve some other motivation for the attacker, such as curiosity or need for recognition. Because many attacks are promiscuous in nature, it is very important that every organization place some level of priority on security regardless of the intrinsic value of the information or networks they employ.

That Pesky SSID and Your Wireless LAN

The service set identifier (SSID) is meant to differentiate networks from one another, and access points ship with a default SSID that should be changed on every unit. For example, most Linksys access points are set to the network name of linksys, most early Cisco access points had a default SSID of tsunami, most Netgear access points are set to netgear, and so on. These default SSIDs are widely documented on the Internet and are well known by any cracker. An SSID still set to the default is often a glaring banner to the attacker that reads, “Please attack me as I am still configured to all default settings!” While it may not be true that “all” settings are still at their defaults, let’s just say there is a very good chance.

When access points are first installed, the SSID should be changed to something cryptic, not something that could be used to determine the company to which the access point belongs. This recommendation assumes that other companies may be nearby. If no other companies are nearby, the attacker can assume that any visible SSID with good signal strength is the local company’s network. Changing the SSID to something meaningful, such as a department name, can provide an intruder valuable information. For example, if a wireless network is installed for the accounting department and you set the SSID to accounting, any intruder will know there could be financial information on the network to which the access point is attached. With all that being said, however, proper security makes this a moot point – and you should have proper security (WPA2-Personal or WPA2-Enterprise these days).

Some wireless security professionals will suggest that you set the SSID according to strong password principles. I disagree with this suggestion as it implies that the SSID somehow affords security itself. While you can give away too much information about the intent of the network with the SSID name (such as in the accounting department example in the preceding paragraph), you cannot really ensure security through what you might call a cryptic SSID or a strong SSID. Skilled attackers can find and access a wireless network that has no security other than a cryptic SSID very easily. Ultimately, I suggest you use the SSID for its intended purpose: to differentiate between networks and not to provide security.

By default, an access point broadcasts the SSID several times per second in beacons (10 times for most standards-based implementations). By listening for these beacons, intruders are provided the opportunity to gather the SSIDs of any access point within range. “Closing the system” by not broadcasting SSIDs in beacons prevents intruders from passively locating the network. Closed system features are not part of the 802.11 series of standards and they are not supported on all access points. When SSIDs are not broadcast, operating systems like Windows XP do not automatically discover the SSID. This configuration causes a potential intruder to put forth a little more effort to gain access to the network—something an intruder may not be willing to do. Unless your organization is protecting something that a cracker knows is valuable, most crackers will attack the “low hanging fruit” first, meaning that any networks that are broadcasting an SSID will be the first targets for intrusion.

However, even when SSID broadcasting is disabled, the SSID can be discovered using utilities that perform active scanning (sending probe request frames) or wireless packet analyzers (which hear all frame types). Sometimes disabling SSID broadcasting may also work against business goals, as with public wireless networks. These networks must be open so that customers can easily access network resources (usually Internet access). In the end, again, use the SSID for network differentiation and not for security.
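
To see how little protection the SSID provides, consider this minimal sketch of passive SSID collection from beacon frames. It assumes the third-party scapy library and a Wi-Fi adapter already in monitor mode; the interface name is a placeholder, and you should only run something like this against networks you are authorized to test:

```python
# Minimal sketch of passive SSID collection from beacon frames, illustrating
# why the SSID is not a security control. Assumes scapy and an adapter in
# monitor mode; "mon0" is a placeholder interface name.
from scapy.all import sniff, Dot11Beacon, Dot11Elt

seen = set()

def handle(pkt):
    if pkt.haslayer(Dot11Beacon):
        ssid = pkt[Dot11Elt].info.decode(errors="replace")
        bssid = pkt.addr3
        if bssid not in seen:
            seen.add(bssid)
            # A zero-length SSID here indicates a "closed" (hidden) network.
            print(f"{bssid}  SSID: {ssid or '<hidden>'}")

sniff(iface="mon0", prn=handle, timeout=30)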

Now, to be clear, you can certainly have different security settings associated with different SSIDs, but this is not the same thing as saying that SSIDs give you security. They do not. Can we rid ourselves of this thinking once and for all? I hope so.

SUMMARY: Use the SSID attribute to provide organizational structure to your wireless network and as an indicator to your users as to what network they are accessing. Do not use it as a security solution.

Three Reasons Why My Surface Pro Is A Beast Compared To Your Non-Windows Tablet

1) Running Windows Apps
…and I mean all Windows apps. I can run a Windows XP VM, using VMware Player or other tools, and then run almost any application I desire – even those not directly compatible with Windows 8. Yes, it is a bit clunky sometimes trying to “click” in the right place with my fat finger, but pulling out the pen typically resolves this issue. The point is that I can run very important software for an IT geek like me, such as protocol analyzers, spectrum analyzers and programming tools, and I can run them all in their full-blown power – not in some limited, nearly useless, tablet version.

 
2) It’s A Computer
…a real computer. Running with 4 GB RAM and a lickety-split fast processor, I can do anything other basic laptops can do. With a small USB 3 hub, I can connect multiple USB devices at the same time. The Surface Pro, along with the sister Windows 8 Pro tablets now coming out, is the only tablet that can “really” be used as a tablet and then as a desktop computer. When I go into my office, I can plug it into a USB cable (attached to a powered hub) and have full access to external storage, keyboard and mouse. Then I plug in the video cable and I have a large-screen monitor. The performance is as good as my two-year-old desktop sitting across the room.

 
3) It’s A Tablet
…in spite of what many have said (mostly those who have not used it), the Surface Pro is a tablet. Granted, it’s a bit heavier than an iPad, but, then again, it can do a few thousand things the iPad can never do (because of its limited interface options and applications – that’s right, I just said the iPad has limited applications compared to the Surface Pro because it cannot run all of the Windows apps released over the past decade or more [see reason number 1]). The touch sensitivity is equal to that of my iPad and my best Android-based devices. No problems there. The pen is very accurate and makes for excellent diagramming – far superior to anything available on either the iPad or the Android-based tablets.

 
As a side note – I have used iDevices off and on for more than three years, and Android-based devices during that time, so I have lots of experience with all three device types. I waited a couple of months to write this post because I was initially blown away by the Surface Pro and I thought, “Surely this is going to wear off and I will see the flaws in this device that make it less appealing than the Apple or Android devices.” Based on the reviews I had seen to that point, I thought I must be confused about how great it is. Now, after more than two months of use, I am more convinced than ever that, for an IT geek, the other tablets can’t even come close (though this may not be true for the general user). Going back and exploring those reviews again, it became obvious to me that most negative reviews fell into one of the following two categories:

  • Reviews by people who had not used the Surface Pro but commented only on its features.
  • Reviews by people who had used Apple devices for nearly all their work (laptops and tablets) for several years.

Certainly, people in the first category should not be taken seriously. People in the second category should be taken very seriously because they do present an issue for Microsoft. Microsoft has to address the learning curve for that group (and it includes many, many younger buyers today). But I don’t work for Microsoft marketing, so that’s their problem, and this learning curve is not in any way a reflection of usefulness or value for those who are willing to adapt. Stated another way, if a device is harder to use for someone who has been using another device, this is not an important factor in the measurement of either the usability or the functional usefulness of that device. It is simply proof that they know how to use the other device better. Simple as that. From a functional perspective, no one can argue with sincerity that the iPad or Android tablets offer more than the Surface Pro (with the possible exception of access to memory cards, but that is easily solved with a USB memory card reader – though it is, admittedly, not a pretty solution).

 
The reality is that I could go on with another thirty reasons that the Surface Pro is far better for the average IT geek than the other non-Windows tablets, but I simply lack the energy to persuade you. My goal is not really to persuade anyone anyway – just to be a voice that is not influenced by the anti-Microsoft bias that is so common out there. Here’s the way I would summarize it. Do you want a device that can do all of the following as capably as a laptop while being a true tablet?

  • Run advanced IT software
  • Access custom USB hardware
  • Run virtual machines
  • Run Office – real Office or Office-like applications with all capabilities
  • Access hundreds of thousands (millions?) of full-featured applications
  • Current access to tens of thousands of custom Windows 8 UI apps (with a growth rate surpassing 100,000 by the end of summer) – think of these as the “tablet” apps for Windows 8
  • The best Internet browsing experience of any tablet (remember, you can install Firefox or Chrome on here – and I mean the real ones, not the lame tablet releases [smile])

Then the Surface Pro (or one of the sister Windows 8 tablets coming out from other vendors) is right for you. Certainly, it’s not for everyone, but I cannot fathom thinking the competing tablets are better tools for the standard IT pro. However, many will disagree with me and just keep complaining to software vendors about the fact that their needed IT tools are simply not available for the iPad they use.

 

Just sayin’

New Group Policy Settings in Windows 8 and Windows Server 2012

With each new edition of Windows, Microsoft adds new Group Policy capabilities. Group Policy has been with us since the release of Windows 2000, but has roots going back to Windows 95 and Windows NT in the older System Policies. Group Policy allows you to centrally configure settings for Windows clients and servers using Active Directory for deployment and application of these settings.

Windows 8 and Windows Server 2012 introduce new Group Policy settings that may be important to network and system admins. Most of the new settings are related to new features, but many are related to existing features from previous editions of Windows as well. In total, Windows Server 2012 now supports more than 3,400 policy settings. Some apply only to older versions and some apply only to newer versions, but with this many policy settings you clearly have a lot of flexibility in centralized management and configuration of the Windows OS.

A total of 169 new policies require Windows 8 or Windows Server 2012 to function. This does not include the policies that require Internet Explorer 10, which typically means you’re running Windows 8 or Windows Server 2012 as well.

Examples of important policies for Enterprise deployments include:

  • Turn off the Store application (can be applied to users or computers)
  • Turn off tile notifications (the rectangles on the Start screen)
  • Turn off toast notifications (the popup notifications in the upper right corner)
  • Location where all default Library definition files for users/machines reside (allows you to point to a single location for consistent Libraries on all computers)
  • Prevent users from uninstalling applications from Start (normally, you can right-click a tile and simply click uninstall – not good in the Enterprise!)
  • Turn off app notifications on the lock screen (may be required for privacy or to reduce network bandwidth consumption)

In addition to the Windows 8 and Windows Server 2012 policies, another 69 require Internet Explorer 10 or above. You can learn more about all the policies (old and new) by downloading the Group Policy Settings Reference for Windows and Windows Server located here.
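
Under the hood, administrative-template policy settings like these are delivered as registry values under the Policies keys. As a purely hypothetical illustration (the key path and value name below are assumptions for the sake of example, not taken from the settings reference), a quick check of whether the Store policy appears to be set might look like this:

```python
# Hypothetical sketch: check whether the "Turn off the Store application"
# machine policy appears to be set by reading the registry location that
# administrative-template policies write to. The key path and value name
# are assumptions for illustration; consult the settings reference for the
# authoritative locations. Windows-only (uses the standard winreg module).
import winreg

KEY_PATH = r"SOFTWARE\Policies\Microsoft\WindowsStore"  # assumed location
VALUE_NAME = "RemoveWindowsStore"                        # assumed value name

def store_disabled_by_policy() -> bool:
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
            value, _type = winreg.QueryValueEx(key, VALUE_NAME)
            return value == 1
    except FileNotFoundError:
        return False  # policy not configured

print(store_disabled_by_policy())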

 

Zig Ziglar – You Will Be Missed

I know, I’m an IT guy. Why am I talking about a sales trainer and author from the 1970s? Because Zig Ziglar became much more than a sales trainer over his grand career. He was a trainer, motivator, leader and mentor to so many, including me. On the morning of November 28, 2012, Zig passed away at the age of 86.

Many people have influenced my life over the years. In the tech sector, Mark Minasi responded to my emails in the 1990s (I was shocked) and shaped my perception of what an author should be like and how an IT industry expert should interact with his or her customers. In the religious world, Jesus (without comparison) has impacted me more than any man and my Pastor, Richard Collins, has had a profound impact on my life.

From a business perspective, probably no one has impacted me more than Zig Ziglar. Sadly, I never had the chance to meet him, but his audio programs and philosophies have strengthened me through tough times over the past 20 years. I would describe him like this: “Zig Ziglar was a leader, not someone who merely talked about leadership.” Why? He wasn’t afraid to risk everything to stand up for something. That’s a leader. Someone who tells you how to lead but doesn’t stand up for something is really no leader at all. Zig was a leader.

Yes, he is gone, but his legacy remains. It remains in me. It remains in thousands. It remains in books, audio and video recordings that will live on. It remains, because it had an impact. It remains, because it came from passion. It remains, because it should.

Zig, I truly do hope to see you, at the top!

Tom