Archive for the ‘ Tech ’ Category


It’s not often that a great product becomes even greater …

The Raspberry Pi 2 Model B, available from Element 14, was recently released, and it’s a serious step up from its predecessors. Before we dive into what makes it an outstanding product, here is the Raspberry Pi family tree, from oldest to newest:

Raspberry Pi B
Raspberry Pi A
Raspberry Pi B+
Raspberry Pi A+
Raspberry Pi 2 Model B

The + models were upgrades of the previous board versions, and the RPi2B is the Raspberry Pi B+’s direct descendant with added muscle. So, what makes the Raspberry Pi 2 Model B great?

The Raspberry Pi 2 Model B has a 40-pin GPIO header, as did the A+ and B+, and the first 26 pins are identical to the A and B models, making the new board a drop-in upgrade for most projects (a minimal GPIO sketch follows below). The new board also supports all of the expansion (HAT) boards used by the previous models.
The Raspberry Pi 2 Model B has an identical board layout and footprint as the B+, so all cases and 3rd party add-on boards designed for the B+ will be fully compatible.
In common with the B+, the Raspberry Pi 2 Model B has 4 USB 2.0 ports (compared to 2 USB ports on the A, A+, and B models) that can provide up to 1.2 Amps for more power-hungry USB devices (this feature does, however, require a 2 Amp power supply).
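
To illustrate that pin-level compatibility, here’s a minimal sketch of blinking an LED from Go, using only the standard library and the Linux sysfs GPIO interface that Raspbian exposes. BCM pin 17 is just an example, and the same code runs unchanged whether the header has 26 or 40 pins; treat it as a sketch under those assumptions rather than a definitive recipe.

```go
// blink.go — a hedged sketch of toggling a GPIO pin via /sys/class/gpio.
// BCM pin 17 is only an example; any pin shared by the 26-pin and 40-pin
// headers behaves the same way, which is what makes the Pi 2 a drop-in
// upgrade for most GPIO projects.
package main

import (
	"fmt"
	"os"
	"time"
)

const pin = "17" // BCM numbering; chosen for illustration

func write(path, value string) error {
	return os.WriteFile(path, []byte(value), 0644)
}

func main() {
	// Export the pin so it appears as /sys/class/gpio/gpio17.
	if err := write("/sys/class/gpio/export", pin); err != nil {
		fmt.Fprintln(os.Stderr, "export (may already be exported):", err)
	}
	base := "/sys/class/gpio/gpio" + pin
	if err := write(base+"/direction", "out"); err != nil {
		fmt.Fprintln(os.Stderr, "direction:", err)
		return
	}
	// Blink five times, then release the pin.
	for i := 0; i < 5; i++ {
		write(base+"/value", "1")
		time.Sleep(500 * time.Millisecond)
		write(base+"/value", "0")
		time.Sleep(500 * time.Millisecond)
	}
	write("/sys/class/gpio/unexport", pin)
}
```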

The Raspberry Pi 2 Model B video output is via a full-sized HDMI (rev 1.3 & 1.4) port with 14 HDMI resolutions from 640×350 to 1920×1200 with digital audio (there’s also composite video output; see below).

The A, A+, and B models use linear power regulators while the B+ and the Raspberry Pi 2 Model B have switching regulators which reduce power consumption by between 0.5W and 1W.
In common with the B+, the Raspberry Pi 2 Model B’s audio circuit has a dedicated low-noise power supply for better audio quality and analog stereo audio is output on the four pole 3.5mm jack it shares with composite video (PAL and NTSC) output.

The previous top-of-the-line B+ model had 512MB of RAM, while the new Raspberry Pi 2 Model B has 1GB, making it possible to run larger applications and more complex operating system environments.

The previous Raspberry Pi models used a 700 MHz single-core ARM1176JZF-S processor while the Raspberry Pi 2 Model B has upped the ante to a 900 MHz quad-core ARM Cortex-A7, a considerably faster CPU. The result is performance that’s roughly 6 times better! The advantages of upgrading existing projects to the Raspberry Pi 2 Model B are huge.
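
For existing projects that want to take advantage of the extra cores, a quick way to tell which board the code is running on is to read /proc/cpuinfo. The sketch below assumes the SoC strings reported by Raspbian kernels of that era (BCM2708 for the ARM11-based boards, BCM2709 for the Pi 2); newer kernels report the model differently, so treat the strings as assumptions.

```go
// which_pi.go — a hedged sketch for distinguishing a Pi 2 from the
// original single-core boards by inspecting /proc/cpuinfo.
package main

import (
	"fmt"
	"os"
	"strings"
)

func main() {
	data, err := os.ReadFile("/proc/cpuinfo")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		return
	}
	info := string(data)
	switch {
	case strings.Contains(info, "BCM2709"):
		fmt.Println("Raspberry Pi 2 Model B (quad-core Cortex-A7)")
	case strings.Contains(info, "BCM2708"):
		fmt.Println("Original Raspberry Pi family (single-core ARM1176)")
	default:
		fmt.Println("Not a recognized Raspberry Pi")
	}
}
```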

Not only will the Raspberry Pi 2 Model B run all of the operating systems its predecessors ran, it will also be able to run Microsoft’s Windows 10 … for free! Yep, Microsoft has decided that it wants to be part of the Raspberry Pi world and for a good reason; a huge number of kids will have their first experience of computing on RPi boards and what better way to gain new acolytes?

This may be the best improvement of the lot: For the added compute power, increased RAM, and drop-in compatibility there’s no extra cost! The Raspberry Pi 2 Model B is priced at $35, the same as its predecessor!

The Raspberry Pi 2 Model B is one of the best (quite possibly, *the* best) single board computers available and, given the huge popularity of the Raspberry Pi family (now with more than 500,000 Raspberry Pi 2 Model Bs sold and around 5 million Pis in total if you include all models), it’s one of the best understood and supported products of its kind. Whether it’s for hobbyist, educational, or commercial use, the Raspberry Pi 2 Model B is an outstanding product.


 

 


AIIM group finds Microsoft’s Yammer social tool slow to catch on as well, though IT shops hopeful about product roadmap

Many SharePoint installations at enterprises have been doomed largely due to senior management failing to really get behind the Microsoft collaboration technology, according to a new study by AIIM, which bills itself as “the Global Community of IT Professionals.”

The AIIM (Association for Information and Image Management) Web-based survey of 409 member organizations found that nearly two-thirds described their SharePoint projects as either stalled (26%) or not meeting original expectations (37%).

The associated Yammer social business tool has also been slow to catch on, with only about 1 in 5 organizations using it, and only 10% of them using it regularly and on a widespread basis (Disclosure: I use it a bit here and there at IDG Enterprise!). Many organizations aren’t specifically biased against Yammer though — 4 in 10 say they don’t use any such tool.

Reasons cited for tepid uptake of SharePoint and Yammer include inadequate user training and investment.

“Enterprises have it, but workers are simply not engaging with SharePoint in a committed way,” said Doug Miles, AIIM director of market intelligence, in a statement. “It remains an investment priority however, and the C-suite must get behind it more fully than they are currently if they are to realize a return on that investment.”

Miles says it shouldn’t be up to IT departments to push SharePoint within organizations, but rather, business lines should take the lead.

The study showed that 75% of respondents still feel strongly about making SharePoint work at their organizations. The cloud-based Office 365 version has shown good signs of life, and 43% of respondents indicated faith in Microsoft’s product roadmap for its collaboration tools, according to the AIIM report.

Half of respondents expressed concern about a lack of focus by Microsoft on the on-premise version of SharePoint. That’s an issue that market watcher Gartner stressed last year could make SharePoint a lot less useful for organizations counting on it for customer-facing and content marketing applications.

You can get a free full version of the AIIM study, ‘Connecting and Optimizing SharePoint’, by filling out a registration form.

The research was underwritten in part by ASG, AvePoint, Colligo, Concept Searching, Collabware, EMC, Gimmal Group, K2 and OpenText. While Microsoft is a member of AIIM’s Executive Leadership Council, it is not listed as one of the funders for this study.

A Microsoft representative is looking into our request for comment on the report.


 


 

The best office apps for Android

Written by admin
January 19th, 2015

Which office package provides the best productivity experience on Android? We put the leading contenders to the test

Getting serious about mobile productivity
We live in an increasingly mobile world — and while many of us spend our days working on traditional desktops or laptops, we also frequently find ourselves on the road and relying on tablets or smartphones to stay connected and get work done.

Where do you turn when it’s time for serious productivity on an Android device? The Google Play Store boasts several popular office suite options; at a glance, they all look fairly comparable. But don’t be fooled: All Android office apps are not created equal.

I spent some time testing the five most noteworthy Android office suites to see where they shine and where they fall short. I looked at how each app handles word processing, spreadsheet editing, and presentation editing — both in terms of the features each app offers and regarding user interface and experience. I took both tablet and smartphone performance into consideration.

Click through for a detailed analysis; by the time you’re done, you’ll have a crystal-clear idea of which Android office suite is right for you.

Best Android word processor: OfficeSuite 8 Premium
Mobile Systems’ OfficeSuite 8 Premium offers desktop-class word processing that no competitor comes close to matching. The UI is clean, easy to use, and intelligently designed to expand to a tablet-optimized setup. Its robust set of editing tools is organized into easily accessible on-screen tabs on a tablet (and condensed into drop-down menus on a phone). OfficeSuite 8 Premium provides practically everything you need, from basic formatting to advanced table creation and manipulation utilities. You can insert images, shapes, and freehand drawings; add and view comments; track, accept, and reject changes; spell-check; and calculate word counts. There’s even a native PDF markup utility, PDF export, and the ability to print to a cloud-connected printer.

OfficeSuite 8 Premium works with locally stored Word-formatted files and connects directly to cloud accounts, enabling you to view and edit documents without having to download or manually sync your work.

Purchasing OfficeSuite 8 Premium is another matter. Search the Play Store, and you’ll find three offerings from Mobile Systems: a free app, OfficeSuite 8 + PDF Converter; a $14.99 app, OfficeSuite 8 Pro + PDF; and another free app, OfficeSuite 8 Pro (Trial). The company also offers a dizzying array of add-ons that range in price from free to $20.

The version reviewed here — and the one most business users will want — is accessible only by downloading the free OfficeSuite 8 + PDF Converter app and following the link on the app’s main screen to upgrade to Premium. The upgrade is a one-time $19.99 in-app purchase that unlocks every option, giving you the most fully featured setup with no further purchases required.

App: OfficeSuite 8 Premium
Price: $19.99 (via in-app upgrade)
Developer: Mobile Systems

Runner-up Android word processor: Google Docs
Google’s mobile editing suite has come a long way, thanks largely to its integration of Quickoffice, which Google acquired in 2012. With the help of Quickoffice technology, the Google Docs word processor has matured into a usable tool for folks with basic editing needs.

Docs is nowhere near as robust as OfficeSuite 8 Premium, but if you rely mainly on Google’s cloud storage or want to do simple on-the-go writing or editing, it’s light, free, and decent enough to get the job done, whether you’re targeting locally stored files saved in standard Word formats or files stored within Docs in Google’s proprietary format.

Docs’ clean, minimalist interface follows Google’s Material Design motif, making it pleasant to use. It offers basic formatting (fonts, lists, alignment) and tools for inserting and manipulating images and tables. The app’s spell-check function is limited to identifying misspelled words by underlining them within the text; there’s no way to perform a manual search or to receive proper spelling suggestions.

Google Docs’ greatest strength is in its cross-device synchronization and collaboration potential: With cloud-based documents, the app syncs changes instantly and automatically as you work. You can work on a document simultaneously from your phone, tablet, or computer, and the edits and additions show up simultaneously on all devices. You can also invite other users into the real-time editing process and keep in contact with them via in-document commenting.

App: Google Docs
Price: Free
Developer: Google

The rest of the Android word processors
Infraware’s Polaris Office is a decent word processor held back by pesky UI quirks and an off-putting sales approach. The app was clearly created for smartphones; as a result, it delivers a subpar tablet experience, with basic commands tucked away and features like table creation stuffed into short windows that require awkward scrolling to see all the content. Polaris also requires you to create an account before using the app and pushes a $40-a-year membership that grants access to a few extras and the company’s superfluous cloud storage service.

Kingsoft’s free WPS Mobile Office (formerly Kingsoft Office) has a decent UI but is slow to open files and makes it difficult to find documents stored on your device. I also found it somewhat buggy and inconsistent: When attempting to edit existing Word (.docx) documents, for instance, I often couldn’t get the virtual keyboard to load, rendering the app useless. (I experienced this on multiple devices, so it wasn’t specific to any one phone or tablet.)

DataViz’s Docs to Go (formerly Documents to Go) has a dated, inefficient UI, with basic commands buried behind layers of pop-up menus and a design reminiscent of Android’s 2010 Gingerbread era. While it offers a reasonable set of features, it lacks functionality like image insertion and spell check; also, it’s difficult to find and open locally stored documents. It also requires a $14.99 Premium Key to remove ads peppered throughout the program and to gain access to any cloud storage capabilities.

Best Android spreadsheet editor: OfficeSuite 8 Premium
With its outstanding user interface and comprehensive range of features, OfficeSuite 8 Premium stands out above the rest in the realm of spreadsheets. Like its word processor, the app’s spreadsheet editor is clean, easy to use, and fully adaptive to the tablet form.

It’s fully featured, too, with all the mathematical functions you’d expect organized into intuitive categories and easily accessible via a prominent dedicated on-screen button. Other commands are broken down into standard top-of-screen tabs on a tablet or are condensed into a drop-down menu on a smartphone.

From advanced formatting options to multiple-sheet support, wireless printing, and PDF export, there’s little lacking in this well-rounded setup. And as mentioned above, OfficeSuite offers a large list of cloud storage options that you can connect with to keep your work synced across multiple devices.

App: OfficeSuite 8 Premium
Price: $19.99 (via in-app upgrade)
Developer: Mobile Systems

Runner-up Android spreadsheet editor: Polaris Office
Polaris Office still suffers from a subpar, non-tablet-optimized UI, but after OfficeSuite 8 Premium, it’s the next best option.

Design aside, the Polaris Office spreadsheet editor offers a commendable set of features, including support for multiple sheets and easy access to a full array of mathematical functions. The touch targets are bewilderingly small, which is frustrating for a device that’s controlled by fingers, but most options you’d want are all there, even if not ideally presented or easily accessible.

Be warned that the editor has a quirk: You sometimes have to switch from “view” mode to “edit” mode before you can make changes to a sheet — not entirely apparent when you first open a file. Be ready to be annoyed by the required account creation and subsequent attempts to get you to sign up for an unnecessary paid annual subscription.

Quite honestly, the free version of OfficeSuite would be a preferable alternative for most users; despite its feature limitations compared to the app’s Premium configuration, it still provides a better overall experience than Polaris or any of its competitors. If that doesn’t fit the bill for you, Polaris Office is a distant second that might do the trick.

App: Polaris Office
Price: Free (with optional annual subscription)
Developer: Infraware

The rest of the Android spreadsheet editors
Google Sheets (part of the Google Docs package) lacks too many features to be usable for anything beyond the most basic viewing or tweaking of a simple spreadsheet. The app has a Function command for standard calculations, but it’s hidden and appears in the lower-right corner of the screen inconsistently, rendering it useless most of the time. You can’t sort cells or insert images, and its editing interface adapts poorly to tablets. Its only saving grace is integrated cloud syncing and multiuser/multidevice collaboration.

WPS Mobile Office is similarly mediocre: It’s slow to open files, and its Function command — a vital component of spreadsheet work — is hidden in the middle of an “Insert” menu. On the plus side, it has an impressive range of features and doesn’t seem to suffer from the keyboard bug present in its word-processing counterpart.

Docs to Go is barely in the race. Its embarrassingly dated UI makes no attempt to take advantage of the tablet form. Every command is buried behind multiple layers of pop-up menus, all of which are accessible only via an awkward hamburger icon at the top-right of the screen. The app’s Function command doesn’t even offer descriptions of what the options do — only Excel-style lingo like “ABS,” “ACOS,” and “COUNTIF.” During my testing, the app failed to open some perfectly valid Excel (.xlsx) files I used across all the programs as samples.

Best Android presentation editor: OfficeSuite 8 Premium
OfficeSuite 8 Premium’s intuitive, tablet-optimized UI makes it easy to edit and create presentations on the go. Yet again, it’s the best-in-class contender by a long shot. (Are you starting to sense a pattern here?)

OfficeSuite offers loads of options for making slides look professional, including a variety of templates and a huge selection of slick transitions. It has tools for inserting images, text boxes, shapes, and freehand drawings into your slides, and it supports presenter notes and offers utilities for quickly duplicating or reordering slides. You can export to PDF and print to a cloud-connected printer easily.

If you’re serious about mobile presentation editing, OfficeSuite 8 Premium is the only app you should even consider.

App: OfficeSuite 8 Premium
Price: $19.99 (via in-app upgrade)
Developer: Mobile Systems

Runner-up Android presentation editor: Polaris Office
If it weren’t for the existence of OfficeSuite, Polaris’s presentation editor would look pretty good. The app offers basic templates to get your slides started; they’re far less polished and professional-looking than OfficeSuite’s, but they get the job done.

Refreshingly, the app makes an effort to take advantage of the tablet form in this domain, providing a split view with a rundown of your slides on the left and the current slide in a large panel alongside it. (On a phone, that rundown panel moves to the bottom of the screen and becomes collapsible.)

With Polaris, you can insert images, shapes, tables, charts, symbols, and text boxes into slides, and drag and drop to reorder any slides you’ve created. It offers no way to duplicate an existing slide, however, nor does it sport any transitions to give your presentation pizazz. It also lacks presenter notes.

Most people would get a better overall experience from even the free version of OfficeSuite, but if you want a second option, Polaris is the one.

App: Polaris Office
Price: Free (with optional annual subscription)
Developer: Infraware

The rest of the Android presentation editors
Google Slides (part of the Google Docs package) is bare-bones: You can do basic text editing and formatting, and that’s about it. The app does offer predefined arrangements for text box placement — and includes the ability to view and edit presenter notes — but with no ability to insert images or slide backgrounds and no templates or transitions, it’s impossible to create a presentation that looks like it came from this decade.

WPS Mobile Office is similarly basic, though with a few extra flourishes: The app allows you to insert images, shapes, tables, and charts in addition to plain ol’ text. Like Google Slides, it lacks templates, transitions, and any other advanced tools and isn’t going to create anything that looks polished or professional.

Last but not least, Docs to Go — as you’re probably expecting by this point — borders on unusable. The app’s UI is dated and clunky, and the editor offers practically no tools for modern presentation creation. You can’t insert images or transitions; even basic formatting tools are sparse. Don’t waste your time looking at this app.

Putting it all together
The results are clear: OfficeSuite 8 Premium is by far the best overall office suite on Android today. From its excellent UI to its commendable feature set, the app is in a league of its own. At $19.99, the full version isn’t cheap, but you get what you pay for, which is the best mobile office experience with next to no compromises. The less fully featured OfficeSuite 8 Pro ($9.99) is a worthy one-step-down alternative, as is the basic, ad-supported free version of the main OfficeSuite app.

If basic on-the-go word processing is all you require — and you work primarily with Google services — Google’s free Google Docs may be good enough. The spreadsheet and presentation editors are far less functional, but depending on your needs, they might suffice.

Polaris Office is adequate but unremarkable. The basic program is free, so if you want more functionality than Google’s suite but don’t want to pay for OfficeSuite — or use OfficeSuite’s lower-priced or free offerings — it could be worth considering. But you’ll get a significantly less powerful program and less pleasant overall user experience than what OfficeSuite provides.

WPS Mobile Office is a small but significant step behind, while Docs to Go is far too flawed to be taken seriously as a viable option.

With that, you’re officially armed with all the necessary knowledge to make your decision. Grab the mobile office suite that best suits your needs — and be productive wherever you may go.



 

Coming soon: Better geolocation Web data

Written by admin
January 8th, 2015

The W3C and OGC pledge to ease the path for developing location-enriched Web data

From ordering pizza online to pinpointing the exact location of a breaking news story, an overwhelming portion of data on the Web has geographic elements. Yet for Web developers, wrangling the most value from geospatial information remains an arduous task.

Now the standards body for the Web has partnered with the standards body for geographic information systems (GIS) to help make better use of the Web for sharing geospatial data.

Both the World Wide Web Consortium (W3C) and the Open Geospatial Consortium (OGC) have launched working groups devoted to the task. They are pledging to closely coordinate their activities and publish joint recommendations.

Adding geographic elements to data online in a meaningful way “can be done now, but it is difficult to link the two worlds together and to use the infrastructure of the Web effectively alongside the infrastructure of geospatial systems,” said Phil Archer, who is acting as data activity lead for the W3C working group.

A lack of standards is not the problem. “The problem is that there are too many,” he said. With this in mind, the two standards groups are developing a set of recommendations for how to best use existing standards together.

As much as 80 percent of data has some geospatial element to it, IT research firm Gartner has estimated. In the U.S. alone, geospatial services generate approximately $75 billion in annual revenue, according to the Boston Consulting Group.

Making use of geospatial data still can be a complex task for the programmer, however. An untold amount of developer time is frittered away trying to understand multiple formats and sussing out the best ways to bridge them together.

For GIS software, the fundamental units of geospatial surface measurement are the point, the line and the polygon. Yet people who want to use geographically enhanced data tend to think about locations in a fuzzier manner.

For instance, say someone wants to find a restaurant in the “Little Italy” section of a city, Archer explained. Because such neighborhoods are informally defined, they don’t have a specific grid of coordinates that could help in generating a definitive set of restaurants in that area.

“That sort of information is hard to get if you don’t have geospatial information and it is also hard to get if you only have geospatial information,” Archer said.
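
One common bridge between the two worlds is to pin an informal place name to an explicit boundary. The sketch below, in Go, emits a GeoJSON Feature for a hypothetical “Little Italy” polygon; the coordinates are invented purely for illustration, since the whole point is that no official ones exist.

```go
// littleitaly.go — a sketch of wrapping an informally defined neighborhood
// in a GeoJSON Feature so it can be linked to other Web data.
package main

import (
	"encoding/json"
	"fmt"
)

// Feature is a minimal GeoJSON Feature: a named polygon.
type Feature struct {
	Type       string            `json:"type"`
	Properties map[string]string `json:"properties"`
	Geometry   Geometry          `json:"geometry"`
}

type Geometry struct {
	Type        string         `json:"type"`
	Coordinates [][][2]float64 `json:"coordinates"` // rings of [longitude, latitude] pairs
}

func main() {
	// Invented vertices; informal neighborhoods have no official polygon,
	// which is exactly the gap described above.
	ring := [][2]float64{
		{-74.0000, 40.7180},
		{-73.9950, 40.7180},
		{-73.9950, 40.7215},
		{-74.0000, 40.7215},
		{-74.0000, 40.7180}, // GeoJSON rings close on their first point
	}
	f := Feature{
		Type:       "Feature",
		Properties: map[string]string{"name": "Little Italy (informal)"},
		Geometry:   Geometry{Type: "Polygon", Coordinates: [][][2]float64{ring}},
	}
	out, _ := json.MarshalIndent(f, "", "  ")
	fmt.Println(string(out))
}
```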

Much of the groups’ work will center on bridging geolocational and non-geolocational data in better ways — work the two groups agreed was needed at a joint meeting last March in London.

The groups will build on previous research done in the realm of linked open data, an approach of formatting disparate sources of data so they can be easily interlinked.

The groups will also look at ways to better harness emerging standards, notably the W3C’s Semantic Sensor Network ontology and OGC’s GeoSPARQL.
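
For a flavor of how those pieces fit together, here is a hedged sketch of querying a GeoSPARQL-aware SPARQL endpoint from Go using only the standard library. The endpoint URL is hypothetical, and the query simply asks for features whose geometry falls within a bounding polygon using the OGC GeoSPARQL vocabulary.

```go
// geoquery.go — a sketch of posting a GeoSPARQL query to a SPARQL endpoint.
// The endpoint is a placeholder; substitute one that serves GeoSPARQL data.
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/url"
)

func main() {
	endpoint := "http://example.org/sparql" // hypothetical

	query := `
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
SELECT ?feature WHERE {
  ?feature geo:hasGeometry ?g .
  ?g geo:asWKT ?wkt .
  FILTER(geof:sfWithin(?wkt,
    "POLYGON((-74.0 40.718, -73.995 40.718, -73.995 40.7215, -74.0 40.7215, -74.0 40.718))"^^geo:wktLiteral))
}`

	// The SPARQL 1.1 Protocol allows queries to be sent as a URL-encoded form.
	resp, err := http.PostForm(endpoint, url.Values{"query": {query}})
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(string(body)) // results come back as XML or JSON, depending on the endpoint
}
```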

The working groups plan to define their requirements within the next few months and will issue best-practices documents as early as the end of the year.



16 of the hottest IT skills for 2015

Written by admin
January 4th, 2015

2015 will bring new opportunities for professional growth and development, not to mention more money. But which specific skills will add the most value to your career advancement in the new year?

The Hottest IT and Tech Skills for 2015
What skills should IT professionals add to their toolbox to increase their compensation in 2015? To find out, CIO.com worked with David Foote, chief analyst and research officer with Foote Partners, to comb through the firm’s quarterly data and uncover which skills will lead to higher pay in the short term and help IT pros navigate the tech industry toward their next career move in the long term.

Foote Partners uses a proprietary methodology to track and validate compensation data for tech workers. It collects data on 734 individual certified and noncertified IT skills. Of those skills, 384 are of the noncertified variety and are the focus of this report.

Cloud Skills
Cloud adoption continues to accelerate as organizations large and small try to capitalize on cloud computing’s cost benefits. In fact, it has become mainstream in IT organizations, with adoption among IT departments somewhere near 90 percent for 2014. “Companies began discovering the cloud about four years ago and it’s been quite volatile in the last year. Will companies continue to invest in the cloud? The answer is ‘yes,’” according to Foote.

Although Foote Partners has found a 3 percent to 3.5 percent drop in market value, Foote notes it’s an area with some unpredictability, but that it’s cyclical. “It’s a volatile marketplace when it comes to talent,” he says.

Architecture
Foote points out that as organizational complexity increases, businesses are becoming more aware of the value of a great architect, and these roles are showing up with more frequency among his clients. The Open Group Architecture Framework (TOGAF), in particular, is the most highly paid noncertified IT skill and a regular on the hot-skills lists.

“We know a lot of companies are getting into architecture in a bigger way. They’re hiring more architects; they’re restructuring their enterprise architect departments. They’re starting to see a lot of value, and no one is really debating that you can never have too many talented architects in your business. This is not something you can ignore. Everyone is thinking that no matter what we do today, we have to always be thinking down the road — three years, five years or more. The people that do that for a living are architects,” says Foote.

Database/Big Data Skills
Big data is attractive to organizations for a number of reasons. Unfortunately, many of those reasons haven’t panned out. According to Foote, companies got caught up in the buzz and now they are taking a more conservative approach. That said, this is an area that Foote Partners expects to grow in 2015. Adding any of these skills to your skillset will make you more valuable to any employer looking to capitalize on the promise of big data.

Although it just missed the firm’s list of highest-paying noncertified IT skills, pay for data science skills is expected to increase into 2015. “This group [of skills] is in transition. There is still a big buzz factor around data sciences, which will result in companies paying more for this skill,” says Foote.

Data management will increasingly be important as companies try to wrangle actionable data from their many disparate sources of data.

Applications Development Skills
Applications development is undoubtedly a hot skills area. Demand for both mobile and desktop developers continues to increase, and this trend will continue well into 2015. However, Foote Partners’ data suggests that the three skills listed here are poised for significant growth in the coming year. It’s worth noting that JavaFX and user interface/experience design skills also made Foote Partners’ list of highest-paying noncertified IT skills.

Organizations are more regularly refining their digital customer experience, making user interface and experience design crucial skills in the coming year.

JavaFX is coming on strong as it replaces Swing in the marketplace.

Agile programming is new to the noncertified IT skills list, but Foote predicts pay premiums for this area will grow into 2015.

SAP and Enterprise Business Applications Skills
SAP is a global provider of enterprise business applications, with ERP modules spanning everything from business operations to CRM. Foote Partners tracks nearly 93 SAP modules and has noticed a lot of fluctuation in value among them over the last year. However, according to Foote Partners’ data, SAP CO-PA, SAP FI-FSCM, SAP GTS and SAP SEM are all expected to be hot in 2015.

Security Skills
Security came to the forefront in 2014, with organizations large and small targeted by cybercriminals. The list of businesses attacked is long and includes heavyweights like Sony, eBay and Target, to name a few. Foote points out that cybersecurity is now part of today’s lexicon for techies and consumers alike.

“Security is blown wide open. Cybersecurity has now become an issue that everyone sees as important. Inside cybersecurity skills and certifications there is a lot of activity. It’s gone mainstream. I think you’re going to see cybersecurity on this list for some time to come,” says Foote.

Management, Process and Methodology Skills
Project and program management are new to the list, but Foote Partners predicts this area will be in high demand in 2015.

Foote emphasizes that fluctuations in pay premiums don’t tell the whole story. The firm also applies what it has learned from the data provided by the 2,648 employers it works with. That’s why you may have noticed that some of the skills covered appear flat; they make the list of hot skills because Foote Partners has uncovered data or a trend that will likely drive up pay in these areas in 2015.

“There is more than recent pay premium track record considered in our forecast list. We talk to a lot of people in the field making decisions about skills acquisition at their companies. We look at tech evolution and where we think skills consumption is heading and so forth,” says Foote.


 


The greatest tech wins and epic comebacks of 2014

Written by admin
January 1st, 2015

From gigantic smartphones to virtual reality, here are the products, companies and ideas that emerged victorious in the tech world this year.

Refinement, not revolution
While 2014 didn’t bring much in the way of revolutionary technology, it was a great year for refinement. The products and services we’ve relied on for years became cheaper and more accessible, while once-difficult concepts like virtual reality and mobile wallets started to look a little more practical. And if you look hard enough, you can even find some examples where the government didn’t screw everything up.

Here are the top 10 products, companies and ideas that emerged victorious in the tech world this year.

Microsoft’s new moves
Whether you loved or loathed Steve Ballmer, you’ve got to admit Microsoft has become a more exciting company since his departure. Under new CEO Satya Nadella, Microsoft has slain the sacred cows of Windows and Office, offering free versions of both on tablets and other mobile devices. We’ve seen Microsoft show a deep appreciation for other platforms as well, with new apps and integrations on Android and iPhone. The message? If you haven’t been paying attention to Microsoft lately, you might want to reconsider.

Apple Pay makes the mobile wallet work
Mobile payments had plenty of naysayers before the arrival of Apple Pay, as they wondered how paying at the checkout line with a smartphone could ever be easier than pulling out a credit card. Apple’s answer is simple: Pair the iPhone’s TouchID fingerprint reader with NFC, so users can pay without even looking at their phones or turning on the screen. Not only is that more efficient than a credit card, it’s way more secure because it never transmits the actual card number. Older solutions never quite got it right, and that’s why Apple Pay quickly became the mobile payments frontrunner.

PlayStation 4 asserts its dominance
While Microsoft hemmed and hawed over its Xbox strategy, Sony realized early on that it could take control of the console wars with lower pricing and a focus on gaming. That plan paid off this year, as the PlayStation 4 outsold the Xbox One in the United States for 10 months in a row. True, Microsoft had a strong November thanks to significant price drops, but chances are those cuts wouldn’t have happened if Sony hadn’t built up a commanding lead.

Validation for gigantic phones
Samsung was onto something when it launched the Galaxy Note in 2011, even if pundits failed to recognize it at the time. Three years later, even regular-sized phones from Samsung and LG have screens exceeding five inches, and Apple finally saw fit to super-size its iPhone lineup with 4.7-inch and 5.5-inch models. While there’s an argument to be made for smaller screens, the jumbo phone is here to stay.

Net neutrality protesters win this round
FCC Chairman (and former telecom lobbyist) Tom Wheeler probably expected some pushback when he proposed some alarmingly flaccid net neutrality rules earlier this year, but the actual response was overwhelming. The FCC received a record 3 million comments—most of them opposed to Wheeler’s proposal—and last month, President Barack Obama urged Wheeler to create stronger protections by reclassifying broadband as a phone-like utility. Even if the FCC makes a decision in the spring, as many expect, lawsuits could prolong the conflict for years. At least the public can feel good about making their voices heard.

Oculus takes Facebook’s money to make VR huge
Until March of this year, Oculus was chugging along as a grassroots effort, with big ambitions for virtual reality but not enough capital to see them through. That was before Facebook splashed the VR pot with a $2 billion acquisition. The move had plenty of detractors, but Facebook’s money allows Oculus to move faster, create better products, and maybe even finally bring virtual reality to the masses. If Facebook can keep its promises not to meddle too much, it might even be a way to win back some much-needed trust.

Winamp keeps on keeping on
After 15 years of kicking out the jams, Winamp seemed to be at the end of its rope last November. A notification informed users that the once-beloved MP3 player would go offline the following month, kicking off a final wave of nostalgia. But in January, Winamp got a reprieve, with a last-minute acquisition by Internet radio firm Radionomy. Winamp may never return to its glorious past, but at least it still has a future.

Cord-cutting gets real
With more people giving up their cable TV subscriptions or deciding not to have one in the first place, it’s getting harder for the pay TV industry to pretend that cord-cutting isn’t real. This year’s biggest acknowledgment of reality came courtesy of HBO, which now says it will launch a standalone streaming service in 2015. Showtime quickly followed suit. Expect this to become a trend as the expensive, bloated cable bundle reaches its tipping point.

Cloud storage gets dirt-cheap
If you’d written off cloud storage as being too expensive to contain all your precious digital belongings, 2014 has been a good year to reconsider. Microsoft kicked off the cloud storage price wars with 1TB for Office 365 subscribers, and later went fully unlimited. Google followed with reduced pricing for consumers and unlimited storage for enterprise users. And Dropbox, whose price per gigabyte had never been a bargain, upped its $10-per-month service from 100GB to 1TB. Add Amazon’s unlimited photo storage for Prime subscribers to the mix, and you’ve got plenty of cloud storage options on the cheap.

Supreme Court says no to warrantless phone search
The U.S. Supreme Court didn’t get everything right this year (see: Aereo). But at least the Justices had the sense to realize that the contents of your phone are just as personal and private as the belongings in your house. As such, law enforcement can’t search smartphones without a warrant. At a time of rapidly eroding digital privacy, the decision was a much-needed shot of sanity.



 

 

The news follows the decision to release the controversial movie in some theaters on Christmas Day

The controversial movie “The Interview” is now available online through Google and Microsoft services as well as a Sony Pictures website, the companies announced separately on Wednesday.

This development follows Sony’s decision to screen the comedy in select U.S. cities on its original release date of Christmas Day after initially canceling those plans.

A satire that depicts how two U.S. journalists would carry out an assignment to assassinate North Korean leader Kim Jong-un, “The Interview” has been anything but a barrel of laughs for Sony, which produced the film.

Sony suffered a cyberattack that resulted in theft of emails containing sensitive information like actor salaries and plots of upcoming movies. Additionally, threats of violence against theaters that showed the film led Sony to cancel its theatrical release, a decision it later reversed.

Google, which made the movie available to either rent or buy through YouTube and Google Play, weighed the security concerns before agreeing to offer the movie, said David Drummond, Google’s senior vice president of corporate development and chief legal officer, in a blog post.

“Sony and Google agreed that we could not sit on the sidelines and allow a handful of people to determine the limits of free speech in another country (however silly the content might be),” he said, adding that Sony approached the company last week about making the film available online.

People with an Xbox game console, a Windows Phone, or a PC or tablet running Windows 8 or 8.1 can either purchase or rent the movie, Microsoft said in a blog post that also touched on themes of freedom to explain its decision to sell the film.

“Our Constitution guarantees for each person the right to decide what books to read, what movies to watch, and even what games to play. In the 21st Century, there is no more important place for that right to be exercised than on the Internet,” wrote Brad Smith, Microsoft’s general counsel and executive vice president, legal and corporate affairs.

Microsoft and Google charge US$5.99 to rent the movie and $14.99 to buy it.


 


Google Go ventures into Android app development

Written by admin
December 12th, 2014

Google Go 1.4 adds official support for Android, as well as improved syntax and garbage collection

Google’s Go language, which is centered on developer productivity and concurrent programming, can now be officially used for Android application development.

The capability is new in version 1.4, released this week. “The most notable new feature in this release is official support for Android. Using the support in the core and the libraries in the golang.org/x/mobile repository, it is now possible to write simple Android apps using only Go code,” said Andrew Gerrand, lead on the Google Cloud Platform developer relations team, in a blog post. “At this stage, the support libraries are still nascent and under heavy development. Early adopters should expect a bumpy ride, but we welcome the community to get involved.”

Android has commonly leveraged Java programming on the Dalvik VM, with Dalvik replaced by ART (Android Run Time) in the recently released Android 5.0 OS. Open source Go, which features quick compilation to machine code, garbage collection, and concurrency mechanisms, expands options for Android developers. The upgrade can build binaries on ARM processors running Android, release notes state, and build a .so library to be loaded by an Android application using supporting packages in the mobile subrepository.

“Go is about making software simpler,” said Gerrand in an email, “so naturally, application development should be simpler in Go. The Go Android APIs are designed for things like drawing on the screen, producing sounds, and handling touch events, which makes it a great solution for developing simple applications, like games.”

Android could help Go grow, said analyst Stephen O’Grady, of RedMonk: “The Android support is very interesting, as it could eventually benefit the language much the same way Java has from the growth of the mobile platform.”

Beyond the Android capabilities, version 1.4 improves garbage collection and features support for ARM processors on Native Client cross-platform technology, as well as for AMD64 on Plan 9. A fully concurrent collector will come in the next few releases.

Introduced in 2009, the language has been gaining adherents lately. Go 1.3, the predecessor to 1.4, arrived six months ago. Go, O’Grady said, “is growing at a healthy pace. It was just outside our top 20 the last time we ran our rankings [in June], and I would not be surprised to see it in the Top 20 when we run them in January.”

Version 1.4 contains “a small language change, support for more operating systems and processor architectures and improvements to the tool chain and libraries,” Gerrand said. It maintains backward compatibility with previous releases. “Most programs will run about the same speed or slightly faster in 1.4 than in 1.3; some will be slightly slower. There are many changes, making it hard to be precise about what to expect.”

The change to the language is a tweak to the syntax of for-range loops, said Gerrand. “You may now write for range s { to loop over each item from s, without having to assign the value, loop index, or map key.” The go command, meanwhile, has a new subcommand, called go generate, to automate the running of tools generating source code before compilation. The Go project with version 1.4 has been moved from Mercurial to Git for source code control.
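
To make the loop change concrete, here is a small illustrative example that compiles with Go 1.4 or later; it is only a sketch contrasting the old and new forms.

```go
package main

import "fmt"

func main() {
	s := []string{"a", "b", "c"}

	// Before Go 1.4: a loop that ignores both index and value still had to
	// bind something, typically the blank identifier.
	count := 0
	for _ = range s {
		count++
	}

	// Go 1.4: the assignment can be dropped entirely.
	for range s {
		count++
	}

	fmt.Println(count) // 6
}
```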



Blowing up entrenched business models and picking up the profits that spill onto the floor is a time-honored tradition in tech, these days known by the cliche of the moment, “disruption.” This year everyone was trying to push back against those upstarts, whether by buying them like Facebook did, reorganizing to compete with them like HP and Microsoft have done, or just plain going out against them guns blazing, as it seemed that every city and taxi company did with Uber. European courts fought the disruptive effect Google search has had on our very sense of the historical record. But meanwhile, legions of net neutrality supporters in the US spoke up to save the Internet’s core value of disruption against the oligopoly of a handful of communications carriers. Here are our picks for the top stories of a very, well, disruptive year.

Nadella aims Microsoft toward relevancy in a post-PC world
Taking over from Steve Ballmer in February, CEO Satya Nadella faced several uncomfortable truths, among them: Windows powers only 15 percent of all computing devices worldwide, including smartphones, tablets and PCs, meaning Microsoft is no longer at the center of most people’s computing experience. Nadella says he wants Microsoft to be the productivity and platform company for a “mobile first, cloud first world.” Under Nadella, Microsoft has launched Office for the iPad, embraced open source software for its Azure cloud and launched the beta for Windows 10, which promises to smooth out Windows 8’s confusing, hybrid user interface. Shortly after closing the Nokia acquisition he inherited, Nadella announced 18,000 job cuts, 14 percent of its global staff. The bulk of those cuts are in Nokia, which has been relegated to the “other” market share category in smartphones. Microsoft’s sales looked good last quarter, jumping 25 percent year-over-year to $23.2 billion, though profit was hurt by the Nokia buy. Nadella claimed the company is “innovating faster,” which had better be true if he is to succeed.

HP says breaking up is hard, but necessary
Agility appears to be more important than size these days. In an about-face from the direction CEO Meg Whitman set three years ago, Hewlett-Packard announced in October that it will split up, divorcing its PC and printer operations from its enterprise business. When Whitman took the reins from former HP chief Leo Apotheker in 2011, she renounced his idea to split up the venerable Silicon Valley company, saying PCs were key to long-term relationships with customers. But shedding assets is becoming a common strategy for aging tech giants. IBM has focused on enterprise technology and services after selling first its PC operations years ago, and then its server business this year, to Lenovo, and agreeing in October to pay GlobalFoundries $1.5 billion to take over money-losing chip facilities. Symantec announced this year that it would spin off its software storage business, the bulk of which it acquired 10 years ago from Veritas Software for $13.5 billion. The big question for HP is whether it can avoid alienating users and distracting its hundreds of thousands of employees.

Uber’s bumpy ride shakes up the “sharing” economy
Legal challenges and executives behaving badly marked the ascendancy of Uber this year as much as its explosive growth and sky-high valuation. The startup’s hard-driving, take-no-prisoners culture has made it an unlikely poster child for the innocuous—and perhaps misleadingly labeled—“sharing” economy. Announcing the company’s latest billion-dollar cash injection in December, CEO Travis Kalanick bragged that Uber had launched operations in 190 cities and 29 countries this year. The service is now valued at $40 billion. But the company’s army of private drivers face legal challenges, inquiries and preliminary injunctions against operating, from Germany and the UK to various US states. Executives have made matters worse by threatening to dig up dirt on critical journalists and bragging about a tool called “god view” that lets employees access rider logs without permission. Rival app-based ride services like Lyft and Sidecar, whose operations are also the target of inquiries, are distancing themselves from Uber. Added to all this, there are complaints about the legality of other sorts of so-called sharing services, like apartment-rental site Airbnb, which has spawned not just opportunities for regular folks with an extra room and a hospitable nature, but created a class of real-estate investors who are de facto hoteliers. All this suggests that Web-based companies seeking a “share” of profits using middleman tech platforms to disrupt highly regulated businesses like taxis and lodging have some real battles against entrenched interests still to fight.

Facebook gambles $16 billion on WhatsApp
Established companies are snapping up upstarts at a pace not seen since the dot-com boom days, but in February Facebook’s plan to buy WhatsApp for $16 billion had jaws dropping at the price tag. WhatsApp has hit about a half billion users with its mobile messaging alternative to old-school carriers. Facebook already had a chat feature, as well as a stand-alone mobile app called Messenger. But people don’t use them for quick back and forth conversations, as CEO Mark Zuckerberg has acknowledged. At the Mobile World Congress in Barcelona, he confessed that he could not prove in charts and figures that WhatsApp is worth the money he spent, but said that not many companies in the world have a chance at cracking the billion-user mark, and that in itself is incredibly valuable.

Mt Gox implodes, deflating Bitcoin hype
Last year, Bitcoin seemed poised to disrupt conventional currencies. But this year the high-flying cryptocurrency hit some turbulence. The largest Bitcoin exchange in the world, Tokyo-based Mt Gox, fell to earth amid tears and lawsuits after an apparent hack cost the company about 750,000 bitcoins worth about $474 million. The company said a flaw in the Bitcoin software allowed an unknown party to steal the digital currency. A few weeks later Flexcoin, a smaller site, closed after it got hacked. The closures sent tremors of fear through the fledgling Bitcoin market. The leaders of Coinbase, Kraken, Bitstamp, BTC China, Blockchain and Circle all signed a statement lambasting Mt Gox for its “failings.” But the incidents took the luster off Bitcoin. Still, New York’s proposed Bitcoin regulations may establish a legal framework, and confidence, to help exchanges grow in one of the world’s biggest financial centers. Bitcoin concepts may also spur spinoff technology. A company called Blockstream is pursuing ideas to use Bitcoin’s so-called blockchain, a distributed, public ledger, as the basis for a platform for all sorts of transactional applications.

Apple Pay starts to remake mobile payments
Apple’s ascendance to the world’s most valuable company came on top of market-defining products like the iPod, iTunes, the iPhone and the iPad. This year, it was not the iPhone 6 or the as-yet unreleased Apple Watch that came close to redefining a product category—it was Apple Pay. Apple Pay requires an NFC-enabled Apple device, which means an iPhone 6 or 6 Plus, but by early next year, Apple Watch as well. Businesses need NFC-equipped payment terminals. With Apple Pay, you can make a credit or debit card payment simply by tapping your iPhone to the NFC chip reader embedded in a payment terminal. As you tap, you put your finger on the iPhone 6’s biometric fingerprint reader. Apple was careful to line up partners: while Google stumbled trying to get support for its Wallet, more than 500 banks and all major credit card companies are working with Apple Pay. The potential security benefits top it off: When you enter your credit or debit card number, Apple replaces it with a unique token that it stores encrypted. Your information is never stored on your device or in the cloud.

Alibaba’s IPO marks a new era for Chinese brands
In their first day of trading on the New York Stock Exchange in September, Alibaba shares opened at $92.70, 35 percent over the $68 initial public offering price, raking in $21.8 billion and making it the biggest tech IPO ever. Alibaba is an e-commerce behemoth in China, now looking to expand globally. But don’t expect a direct challenge to Amazon right away. Its strategy for international dominance depends not only on broad e-commerce, but also on carving out different niche marketplaces. Shares three months after its opening are going for about $10 more, suggesting that shareholders have faith in that strategy. The IPO also marked the ascendancy of Chinese brands. After scooping up IBM’s PC business years ago, and this year spending $2.3 billion for IBM’s server business as well as $2.9 billion for Motorola, Lenovo is the world’s number one PC company and number three smartphone company. Meanwhile Xiaomi, the “Apple of China,” has become the world’s number-four smartphone vendor.

Regin and the continuing saga of the surveillance state
Symantec’s shocking report on the Regin malware in November opened the latest chapter in the annals of international espionage. Since at least 2008, Regin has targeted mainly GSM cellular networks to spy on governments, infrastructure operators, research institutions, corporations, and private individuals. It can steal passwords, log keystrokes and read, write, move and copy files. The sophistication of the malware suggests that, like the Stuxnet worm discovered in 2010, it was developed by one or several nation-states, quite possibly the U.S. It has spread to at least 10 countries, mainly Russia and Saudi Arabia, as well as Mexico, Ireland, India, Afghanistan, Iran, Belgium, Austria and Pakistan. If Regin really is at least six years old, it means that sophisticated surveillance tools are able to avoid detection by security products for years, a chilling thought for anyone trying to protect his data.

EU ‘right to be forgotten’ ruling challenges Google to edit history
The EU’s Court of Justice’s so-called right to be forgotten ruling in May means that Google and other search engine companies face the mountainous task of investigating and potentially deleting links to outdated or incorrect information about a person if a complaint is made. The ruling came in response to a complaint lodged by a Spanish national insisting that Google delete links to a 1998 newspaper article that contained an announcement for a real-estate auction related to the recovery of social security debts owed by him. The complaint noted the issue had been resolved. But while EU data-privacy officials cheer, free-speech advocates say the ruling’s language means that people can use it to whitewash their history, deleting even factually correct stories from search results. As of mid-November, Google had reviewed about 170,000 requests to delist search results that covered over 580,000 links. The headaches are just starting: Now the EU says the delinking must be applied to all international domains, not just sites within the region.

Obama weighs in as FCC goes back to the drawing boards on net neutrality
In January, a U.S. appeals court struck down the FCC’s 2011 regulations requiring Internet providers to treat all traffic equally. The court said the FCC did not have the authority to enact the rules, challenged in a lawsuit brought by Verizon. The ruling reignited the net neutrality debate, with FCC Chairman Tom Wheeler proposing new rules in April. President Obama in November made his strongest statement on net neutrality to date, urging the FCC to reclassify broadband as a regulated utility, imposing telephone-style regulations. Obama’s move, which critics say is an unprecedented intrusion on an independent government agency, puts political pressure on Wheeler, who reportedly favors a less regulatory approach. The proposal from Wheeler earlier this year stopped short of reclassification, and allowed broadband providers to engage in “commercially reasonable” traffic management. Public comments on Wheeler’s proposal had hit nearly 4 million by September. The ball is now back in Wheeler’s court, as he negotiates a resolution to the whole affair with his fellow commissioners.


 


Full speed ahead for 802.11ac Gigabit Wi-Fi

Written by admin
December 5th, 2014

802.11n takes a back seat as Wave 1 and 2 802.11ac wireless LAN products drive rollouts

Last December customers were peppering wireless LAN vendors with questions about whether to upgrade to the pre-standard-but-certified 802.11ac products flooding the market or hold off until 2015, when more powerful “Wave 2” Gigabit Wi-Fi gear was expected to become prevalent.

A year later, even though Wave 2 products have begun trickling into the market, many IT shops seem less preoccupied with Wave 2 and more focused on installing the Wave 1 11ac routers, access points and other products at hand. After all, this first wave of 11ac is at least a couple of times faster than last-generation 11n, plus it has more range, better power efficiency and better security. And even Apple’s new iPhone 6 and 6 Plus support it.

Surprisingly, 802.11ac products aren’t much more expensive than 11n ones, if at all. That might help explain why market watcher Infonetics reported in September that “802.11ac access point penetration has nearly doubled every quarter and is starting to cannibalize 802.11n.” And the company is optimistic that 11ac and Wave 2 products, plus carrier interest in the technology, will give the WLAN market a boost in 2015.

Ruckus Wireless, which sells WLAN gear to enterprises and carriers, sees customers taking a middle-of-the-road approach, buying some 11ac products now and figuring to buy more when Wave 2 products are plentiful. Ruckus is looking to let customers who do invest in 11ac now upgrade products to Wave 2 at little to no cost down the road.

Aruba Networks, which rolled out 802.11ac access points in May of 2013 to deliver more than 1Gbps throughput, is now shipping more 11ac than 11n gear.

“We’re definitely seeing customers making the shift — almost all of them are either actively looking at ‘ac’ or are starting to think about it in the next year,” says Christian Gilby, director of enterprise product marketing and owner of the @get11ac Twitter handle. “What’s really driving it is the explosion of devices. From a standards point of view, there are [more than 870] devices WiFi Alliance-certified for ‘ac’.”

Many of those devices were certified before the standard was finalized and do not support the performance-enhancing options that so-called Wave 2 products will feature. Chief among these is multi-user MIMO (MU-MIMO), which allows transmission of multiple spatial streams to multiple clients at the same time. It’s seen as being akin to the transition from shared to switched Ethernet.

Wave 2 chipsets and gear have begun trickling out, with Qualcomm being among the latest. But WiFi Alliance certification could still be quite a few months away – maybe even into 2016 — and that could make buyers expecting interoperability hesitate.

The real holdup for Wave 2, though, says Gilby, is that it will require a chipset change in client devices such as laptops and tablets. “You really need the bulk of the clients to get upgraded before you see the benefits,” he says. (A recently released survey commissioned by network and application monitoring and analysis company WildPackets echoed Gilby’s sentiments and found that 41% of those surveyed said that less than 10% of their organization’s client devices supported 11ac.)

Gilby adds that while Wave 2 products will support double the wireless channel width, the government will first need to free up more spectrum to exploit this. Customers will also need to make Ethernet switch upgrades on the back end to handle the higher speeds on the wireless side, and new 2.5Gbps and 5Gbps Ethernet standards are in the works.

Nevertheless it sounds as though enterprise Wave 2 802.11ac products will start spilling forth next year, with high-density applications expected to be the initial use for them. “There’s been some stuff on the consumer side… I think we’ll see some enterprise products on the AP side in 2015…in fact, I’m pretty sure we will,” said Gilby.
Ruckus ZoneFlex R600 802.11ac access point (Ruckus Wireless)
Ruckus Wireless vows to be one of the first vendors to market with a Wave 2 product in 2015 and has already had success with it in the lab using Qualcomm chips, says VP of Corporate Marketing David Callisch. He cautions, though, that vendors will need to work hard on their antenna designs to make Wave 2 work well. “As the WiFi standards become more complex, having more sophisticated RF control is beneficial, especially when you’re talking about having so many streams and wider channels.” He says that “11ac is where it’s at… Customers need the density. WiFi isn’t about the coverage anymore, it’s about capacity.”

Like Gilby, Callisch says the big hold-up with 11ac Wave 2 advancing is on the client side, where vendors are always looking to squeeze costs. Wave 2 is backwards compatible with existing clients, but still…

“It’s expensive to put ‘ac’ into clients,” he says. “If you adopted Wave 2 products today you really couldn’t get what you need to take full advantage of it. But that will change and pretty quickly.”

As for how customers are using 11ac now, Gilby says where they have already installed 11n products on the 5GHz band, they are starting to do AP-for-AP swap-outs. It can be trickier for those looking to move from 2.4GHz 11n set-ups.
Aruba Series 200 802.11ac APs (Aruba Networks)
802.11ac is also catching on among small and midsize organizations, which companies such as Aruba (with its 200 series APs) have started to target more aggressively. Many of these outfits opt for controller-less networks, with the option of upgrading to controllers down the road if their businesses grow.

It’s not too soon to look beyond 11ac, either. The IEEE approved the 802.11ad (WiGig) standard back in early 2013 for high-speed networking in the unlicensed 60GHz radio spectrum band, and the WiFi Alliance will likely be establishing a certification program for this within the next year or so.

Aruba’s Dorothy Stanley, head of standards strategy, says 11ad is “not really about replacing the Wi-Fi infrastructure, but augmenting it for certain apps.”

She says it could have peer-to-peer uses, and cites frequently talked-about scenarios such as downloading a movie or uploading photos at an airport kiosk. These are applications that would require only short-range connections but involve heavy data exchanges.

Stanley adds that developing and manufacturing 11ad products has its challenges. Nevertheless, big vendors such as Cisco and Qualcomm (via its Wilocity buyout) have pledged support for the technology.

“It’s something everybody is looking at and trying to understand where its sweet spot is,” Stanley says. “The promise of it is additional spectrum for wireless communications.”

Another IEEE standards effort dubbed 802.11ax is the most likely successor to 11ac, and has a focus on physical and media-access layer techniques that will result in higher efficiency in wireless communications.



 

Microsoft has yet to provide a solution for customers who can’t connect to Microsoft Update to install last week’s out-of-band patch KB 3011780

The causes of the problem remain cloudy, but the symptoms are quite clear. Starting on Nov. 18, some Server 2003, Windows Home Server 2003, and Windows XP SP3 machines suddenly refused to connect to Microsoft Update. As best I can tell, Microsoft has not responded to the problem, not documented a workaround, and is basically doing nothing visible to fix it.

(Keep in mind that, although Windows XP is no longer supported, Security Essentials updates for XP still go through Microsoft Update, and all old patches for XP are still available — when Microsoft Update is working, anyway.)

The main TechNet thread on the subject says the error looks like this:

The website has encountered a problem and cannot display the page you are trying to view. The options provided below might help you solve the problem.

Error number: 0x80248015

There are also lengthy discussions on the SANS Internet Storm Center site and on the MSFN site.

Some people have reported that simply setting the system clock back a couple of weeks and re-running the update bypasses whatever devils may be lurking. For most, though, that approach doesn’t work.

Alternatives range from deleting the C:\WINDOWS\SoftwareDistribution folder to running wuauclt.exe /detectnow to chasing chicken entrails. Some of the fixes work on some machines, others don’t.
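
For admins who want to try the folder-reset route in a repeatable way, here is a minimal sketch (my own, not a Microsoft-sanctioned fix) that scripts the commonly tried steps: stop the Windows Update service, rename the SoftwareDistribution folder, restart the service and force a new detection pass. Run it from an elevated prompt, and keep in mind that, as noted above, it helps some machines and not others.

# reset_wu_cache.py -- hypothetical helper script; run from an elevated prompt
# on the affected XP / Server 2003 machine. It scripts the folder-reset
# workaround described above; as noted, it fixes some machines and not others.
import os
import subprocess
import time

SD = r"C:\WINDOWS\SoftwareDistribution"

def run(cmd):
    print("> " + cmd)
    subprocess.call(cmd, shell=True)

# Stop the Windows Update service so the folder isn't locked.
run("net stop wuauserv")

# Rename (rather than delete) the update cache so it can be restored later.
if os.path.isdir(SD):
    backup = SD + ".old-" + time.strftime("%Y%m%d%H%M%S")
    os.rename(SD, backup)
    print("Renamed " + SD + " to " + backup)

# Restart the service; it rebuilds SoftwareDistribution on the next check.
run("net start wuauserv")

# Ask the Windows Update agent to re-detect immediately.
run("wuauclt.exe /detectnow")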

Poster Steve on the SANS ISC thread noted an important detail. On machines that get clobbered, when you look at the files C:\WINDOWS\SoftwareDistribution\AuthCabs\muauth.cab and C:\WINDOWS\SoftwareDistribution\AuthCabs\authcab.cab you see a suspicious entry:

<ExpiryDate>2014-11-17T17:27:43.5251853-08:00</ExpiryDate>

That just happens to coincide, more or less, with when Microsoft Update started to fail.

I looked at a couple of machines that are still working fine, and the authcab.cab file on them has this entry:

<ExpiryDate>2018-06-01T17:27:43.5251853-08:00</ExpiryDate>

The muauth.cab file has the 2014 <ExpiryDate>.

I have no idea why the authcab.cab file on some machines has the 2014 date, while others have the 2018 date. But this may be the telltale sign differentiating machines that can still connect to Microsoft Update from those that can’t.
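
If you need to check more than one machine for the telltale date, the following sketch (again, my own diagnostic aid, and it assumes the cabs contain plain-text XML members) uses Windows’ built-in expand.exe to unpack each cab and then searches the extracted files for the <ExpiryDate> element. A date in the past, like the 2014-11-17 value above, matches the broken machines; the 2018 date matches the working ones.

# check_authcab.py -- hypothetical diagnostic: report the <ExpiryDate> values
# inside muauth.cab and authcab.cab. Assumes Windows' built-in expand.exe is
# available and that the cabs contain plain-text (XML) members.
import os
import re
import subprocess
import tempfile

CABS = [
    r"C:\WINDOWS\SoftwareDistribution\AuthCabs\muauth.cab",
    r"C:\WINDOWS\SoftwareDistribution\AuthCabs\authcab.cab",
]

for cab in CABS:
    if not os.path.isfile(cab):
        print(cab + ": not found")
        continue
    tmp = tempfile.mkdtemp()
    # expand.exe extracts every member of the cabinet into the temp folder.
    subprocess.call('expand "%s" -F:* "%s"' % (cab, tmp), shell=True)
    dates = set()
    for name in os.listdir(tmp):
        with open(os.path.join(tmp, name), "rb") as f:
            text = f.read().decode("utf-8", "ignore")
        dates.update(re.findall(r"<ExpiryDate>([^<]+)</ExpiryDate>", text))
    print(cab + ": " + (", ".join(sorted(dates)) or "no ExpiryDate found"))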

Poster b3270791 on the MSFN thread has a solution that seems to work, but it involves replacing the muweb.dll file on the broken machines with an earlier muweb.dll file downloaded from the Internet. While that approach doesn’t exactly exhibit world-class security best practices, it does seem to work.

Does anybody at Microsoft give a hang, with XP already officially out to pasture and Server 2003 due to follow it to the glue farm on July 14, 2015?

Server 2003 admins have been twiddling their thumbs for a week, unable to install that out-of-band patch.

XP users are affected, too, but who cares? Microsoft’s making good on its promise to deliver Security Essentials updates to XP customers. If the customers can’t install them, well, that’s just one of those nasty implementation details, you know.


 


Traces of Regin malware may date back to 2006

Written by admin
November 24th, 2014

The security industry had known about Regin for some time, according to Symantec

Malware that Symantec says was probably developed by a nation state may have been used for as long as eight years, a length of time that underscores the challenges the security industry faces in detecting advanced spying tools.

On Sunday, the computer security company published a 22-page report and blog post on the Regin malware, which it described as a powerful cyberespionage platform that can be customized depending on what type of data is sought.

It was predominantly targeted at telecoms companies, small businesses and private individuals, with different modules customized for stealing particular kinds of information. Symantec found about 100 entities infected with Regin in 10 countries, mostly in Russia and Saudi Arabia, but also in Mexico, Ireland, India, Afghanistan, Iran, Belgium, Austria and Pakistan.

A first version of Regin was active between 2008 and 2011. Symantec began analyzing a second version of Regin about a year ago that had been forwarded by one of its customers, said Liam O’Murchu, a Symantec researcher, in a phone interview Sunday.

But there are forensic clues that Regin may have been active as far back as 2006. In fact, Symantec didn’t actually give Regin its name. O’Murchu said Symantec opted to use that name since it had been dubbed that by others in the security field who have known about it for some time.

If Regin does turn out to be 8 years old, the finding would mean that nation states are having tremendous success in avoiding the latest security products, which doesn’t bode well for companies trying to protect their data. Symantec didn’t identify who it thinks may have developed Regin.

Symantec waited almost a year before publicly discussing Regin because it was so difficult to analyze. The malware has five separate stages, each of which is dependent on the previous stage to be decrypted, O’Murchu said. It also uses peer-to-peer communication, which avoids using a centralized command-and-control system to dump stolen data, he said.

It’s also unclear exactly how users become infected with Regin. So far, Symantec has figured out how just one computer became infected: via Yahoo’s Messenger program, O’Murchu said.

It is possible the user fell victim to social engineering, where a person is tricked into clicking on a link sent through Messenger. But O’Murchu said it is more likely that Regin’s controllers knew of a software vulnerability in Messenger itself and could infect the person’s computer without any interaction from the victim.

“The threat is very advanced in everything it does on the computer,” O’Murchu said. “We imagine these attacks have quite advanced methods for getting it installed.”

Telecom companies have been particularly hard hit by Regin. Some of the companies have been infected by Regin in multiple locations in multiple countries, Symantec found.

The attackers appear to have sought login credentials for GSM base stations, which are the first point of contact for a mobile device to route a call or request data. Stealing administrator credentials could have allowed Regin’s masters to change settings on the base station or access certain call data.

Regin’s other targets included the hospitality, airline and ISP industries, as well as government.

“We do not think [Regin] is a criminal type of enterprise,” O’Murchu said. “It’s more along the lines of espionage.”



 

 

Where to find security certifications

Written by admin
November 19th, 2014

Some say they are essential to a successful security career. Others argue they are an outdated concept and a waste of time. Despite the debate, here are 10 places to learn more about the security trade and the certifications required for some jobs in the information security field.

Security certifications
The debate rages on over whether gaining security certifications means much. Even if you think they aren’t worth the paper they are printed on, there are others who believe certifications prove the individual knows what they are doing. With that, here is a group of organizations that offer security certifications.

CERT
According to its site: “We were there for the first internet security incident and we’re still here 25 years later. We’ve expanded our expertise from incident response to a comprehensive, proactive approach to securing networked systems. The CERT Division is part of the Software Engineering Institute, which is based at Carnegie Mellon University. We are the world’s leading trusted authority dedicated to improving the security and resilience of computer systems and networks and are a national asset in the field of cybersecurity.”

Certified Wireless Network Professional
“The CWSP certification is a professional level wireless LAN certification for the CWNP Program. The CWSP certification will advance your career by ensuring you have the skills to successfully secure enterprise Wi-Fi networks from hackers, no matter which brand of Wi-Fi gear your organization deploys.”

CompTIA
CompTIA has four IT certification series that test different knowledge standards, from entry-level to expert. A list of certifications can be found here.

Global Information Assurance Certification
According to its site: “Global Information Assurance Certification (GIAC) is the leading provider and developer of Cyber Security Certifications. GIAC tests and validates the ability of practitioners in information security, forensics, and software security. GIAC certification holders are recognized as experts in the IT industry and are sought after globally by government, military and industry to protect the cyber environment.”

Information Assurance Certification Review Board
As stated in its charter: “The Board will sponsor a world-class certification set, that meets or exceeds the needs of organizations and individuals wishing to hold candidates to the highest possible level of professional certification in the area of information security. As much as it is feasible, the Board will remain independent from any commercial organization.”

The IACRB currently offers certifications for 7 job-specific responsibilities that reflect the current job duties of information security professionals.

International Information Systems Security Certification Consortium
According to its Web site: “The (ISC)² CBK is the accepted standard in the industry and continues to be updated to reflect the most current and relevant topics required to practice in the field.”

To find out more about what it offers, go here.

ISACA
“As an independent, nonprofit, global association, ISACA engages in the development, adoption and use of globally accepted, industry-leading knowledge and practices for information systems,” according to its Web site.

To find out more about the training offered, go here.

McAfee Institute
McAfee’s motto is “Your Place to Learn Real-World Skills & Advance your Crime-Fighting Career!”

Find out more about its certifications here.

Mile2
Mile2 is a developer and provider of proprietary, vendor-neutral professional certifications for the cyber security industry. Mile2 has a laundry list of internationally recognized cyber security certifications.

Security University
According to its site: “Since 1999, Security University has led the professional cybersecurity education industry in hands-on information security training & education. Security University provides uniform IT security workforce training with performance based, tactical hands-on security skills that qualify and validate the workforce so less people can do the same job or more, with consistent cybersecurity skills.”

If you have taken other certification courses, please let us know how they went in the comments section.



6 ways to maximize your IT training budget

Written by admin
November 15th, 2014

Customized, in-house training often zeros in on topics relevant to your business. However, it comes with an equally high price tag. If your employees simply need to fill in knowledge gaps or get up to speed with a specific software package, there are a plethora of affordable, flexible options for even the most limited budgets.

Although the economy is picking up ever so slightly, IT departments remain on the lookout for ways to do more with less – fewer people, fewer resources, less money. That’s why learning how to stretch the training budget as far as possible can pay significant dividends. This is true both for those organizations seeking to develop employee skills and knowledge for the least expenditure, and for employees looking to improve and enhance their career potential and longevity.

If an organization can get its employees to buy into training and career development, it can effectively double its dollars when costs get split 50-50. This is already an implicit aspect of many tuition support programs, where employers offer a partial stipend or payment to help cover the costs of academic coursework. Why not make it a part of how IT training dollars get spent, too?

Some IT departments offer their employees a menu of courses or certifications from which employees can choose, coupled with (partial) reimbursement plans to help defray their costs. By offering more support for those credentials it needs the most, and less for those credentials outside the “must-have” list, organizations can steer employees in the directions they want them to go.
Negotiate Discounts to Control Costs

Times are tough for training companies, too. If you do want to buy into online or classroom training, you’ll get a better bang from your budget if you negotiate a “group rate” of sorts to cover some or all of your training needs.

Although online or virtual classes may not be as popular as instructor-led in-class training, remote offerings usually cost less to begin with; obtaining additional discounts will help leverage such spending even further. Some training companies offer subscriptions to their entire training libraries on a per-seat, per-month basis.

Pluralsight offers its extensive training catalog to individuals for about $50 a month, for example, and its business offerings include progress tracking and assessments for enrolled employees, as well as library access for some number of individuals. A 10-user license costs about $25 per user per month for a Basic package, and double that for the Plus package, which adds exercises, assessments and offline viewing to the basic ability to watch courses online on a PC or mobile device.
Purchase Key Items in Bulk

If you know you need to run a team of system engineers or senior tech support staff through a specific curriculum that includes certain certification exams, and you can hold those people to a schedule, then you can purchase exam-voucher or training/voucher bundles at a discount. As the purveyor of many popular and high-demand cert exams, and a publisher of copious related training materials, Pearson VUE/Pearson Education offers much of what employers need for such programs. Contact the Voucher Store to inquire about volume purchase pricing and arrangements.

(Note: The author writes on an occasional basis for InformIt, a professional development branch of Pearson, and on a frequent basis for the Pearson IT Certification blog.)
Assemble Employee Study Groups and Resources

Just a little added support for employees involved in training, or preparing for certification, can help organizations realize better results from (and returns on) their training investments. Consider some or all of the following strategies to help employees make the most of their training experience and get the best value for your training dollars:

Set up a wiki or online forums/chat rooms on a per-topic or per-exam basis for employees to use and share.
Encourage employees to share their best resources, learning materials, study techniques and so forth with one another. Build compendia of such materials and pointers for ongoing sharing.
Provide access to practice tests, exercises and simulated or virtual labs for hands-on work so employees can check their learning, buttress their weak spots and develop a well-rounded understanding of training materials, exam objectives and coverage.
Identify local subject matter experts to whom training and certification candidates can turn for added information and explanation when the need arises.

Because many employees will be interested in these kinds of things, you can find volunteers to help create and maintain these kinds of low-cost but high-value training and prep tools and resources.

Provide Recognition and Rewards to Those Who Succeed

Sure, it would be nice if everyone who earns a certification or masters some new body of knowledge could get a 25 percent raise and/or a promotion as a consequence of completing a program of some kind. In some cases, such rewards may even be required to retain employees who earn coveted credentials such as the Cisco CCIE, (ISC)2 CISSP or the ITIL Master Qualification.

However, even small rewards, such as a $100 gift certificate for a family night out or a gift card to a local department store, can communicate your appreciation to those employees who manage to chew, swallow and digest what they must bite off to pursue training and certification. A public pat on the back in the employee newsletter or at a periodic employee meeting doesn’t hurt, either. Recognition provides added impetus for employees to finish what they start and shows them that you value the time and effort they must expend in pursuing training and certification.
Ask for Ideas and Suggestions, Then Act Upon Them

Beyond the various methods to stretch your training budget outlined here, you can also turn to your target audience to ask how it thinks you can maximize the return on training and certification. You may be surprised by the quality and quantity of resulting feedback. Most employees respond positively to on-the-job opportunities for career and professional development. They, too, understand that the likelihood of continuing support rests on the outcomes of their training and certification efforts. In the end, they know full well that, by helping the organization excel and improve, they too will benefit from improved job and pay prospects.


 


3 power-sipping monitors lower energy bills

Written by admin
November 8th, 2014

Can you have a great monitor that also scrimps on electricity — and helps the environment? We test three 27-in. power-saving displays to find out.

While businesses have always been careful about how much they spent on electricity, today’s displays are making it a lot easier to keep bills lower — and the environment safer.

For example, I measured a four-year-old 27-in. Apple Cinema Display as using 101 watts of power. However, the three 27-in. displays that I’ve tested in this roundup of environmentally smart monitors — AOC’s E2752VH, Dell’s UltraSharp 27 UZ2715H and the Philips 271S4LPYEB — use an average of 26.4 watts.

This is also reflected in the yearly cost of electricity to run each monitor. Assuming the display is used for 10 hours every business day and that power costs 12 cents per kilowatt-hour (the national average), the Apple Cinema Display costs $29 for the year, while the three reviewed here average an annual cost of $9.32. That doesn’t sound like a lot, but if you’re a business with a couple of hundred employees, it can add up quickly.

In addition, all three of these displays carry the EPA’s Energy Star logo and boast wide screens that can display full HD resolution. And according to the EPA’s Greenhouse Gas Equivalencies Calculator, every kilowatt-hour of power saved equals 1.5 lbs. of carbon dioxide that isn’t spewed into the atmosphere.

Interestingly, these displays have different strategies as to how they reduce their power consumption. The Dell screen relies on the computer’s screen saver to signal when it’s time to go to sleep. The AOC adds a built-in timer for determining when it turns the screen off. And the Philips display includes a pair of infrared sensors that continually scan to see if someone is sitting in front of the screen. When you get up, it senses that the space in front is empty and shuts the screen down, reducing its power draw.

Of course, saving on electricity doesn’t mean a thing if a display doesn’t excel in its main purpose. To see what these displays have to offer, I used them every day for a couple of months in my office. I took turns with them one-on-one and then viewed them together showing the same material for comparison.

Saving a few kilowatts here and there might not sound like a huge savings. But, if you multiply the savings by the number of monitors in use every day in a company’s offices, it adds up quickly.

If you’re looking for a frugal monitor, AOC’s E2752VH is not only the cheapest display of the three, but is the only one that uses no measurable power when in sleep mode. However, it falls short on creature comforts like a webcam and USB ports.

Like the others reviewed here, the AOC monitor uses a 27-in. IPS panel with a 1920 x 1080 resolution. It features an ultra-fast 2-millisecond (ms) response time, versus 5ms and 8ms for the Philips and Dell displays, respectively.

The all-black casing is broken only by a small blue light in the lower right corner to show it’s turned on. On the right side of the monitor’s front, there are controls for turning it on and off, raising and lowering the volume, and using the on-screen menu. The marking for each switch’s function is embossed in the display’s plastic case; classy, but I found the highlighted white markings on the other two displays easier to read.
Saving power

The AOC monitor has two techniques for saving power when it’s not being used. First, like the Dell display, it can use the computer’s screen saver to trigger its sleep mode. It can also be configured to shut down when the computer is off or goes to sleep. In addition, a timer lets you shut down the screen after a period of inactivity. Unfortunately, the time can be configured in increments of one hour only.

The AOC consumed 27.5 watts when being used, a little more than the Dell display’s power profile. Unlike the others, when the AOC screen goes to sleep, it uses no discernible power, compared to 1.1 watts and 2.1 watts for the Dell and Philips displays, respectively. It took 3.1 seconds to wake up.

Assuming it is used for 10 hours every business day and power costs 12 cents per kilowatt-hour (the national average), the AOC should cost an estimated $8.30 to use per year. That makes it the cheapest of the three to use, if only about $2 a year less than the Philips monitor.
How well it worked

At 215 candelas per square meter, the AOC’s light output was the lowest of the three; to my eyes, it looked visibly dimmer than the Dell monitor. Its color balance appeared accurate with strong blues and reds. Video play was smooth, with no lags or glitches.

In addition to a standard mode, the display has settings for text, Internet, games, movies and sports. For those who want to tweak the output, the monitor has adjustments for brightness, contrast, gamma and color temperature. Unfortunately, temperature settings are restricted to normal, warm, cool and sRGB settings. You can fiddle with the red, blue and green colors, but I preferred using the Philips’s more extensive presets that are based on actual color temperatures.

The monitor also comes with two Windows-only apps. iMenu lets you adjust brightness, contrast and gamma, but lacks the calibration patterns of the Philips display. Interestingly, several of the program’s labels are in Chinese characters, making the app hard to fathom without the manual.

The eSaver app is how you tell the monitor when to go to sleep, based on the status of your PC. For example, I set it to turn off one minute after the computer is shut down or 10 minutes after the computer goes to sleep or the screen saver comes on. Neither of the other monitors reviewed here can match this specificity.

The AOC display makes do with two 2.5-watt speakers; there is no webcam, microphone or USB hub. The speakers are on the bottom edge of the display, so that they sound thin and don’t get nearly as loud as the Dell’s sound system.
Other features

Its assortment of ports (one DVI, one HDMI and one VGA) lacks the Dell’s second HDMI port and the Philips’s DisplayPort input. But the AOC ports are all horizontally oriented, while the other two displays have vertical ports that are more awkward to plug in.

I did appreciate the addition of an analog audio input, which I used to connect to my phone’s output to listen to music while working. The display also has a headphone jack in the back.

The AOC stand was the easiest of the three to set up, because the base snaps into the monitor arm. Like the others, the monitor has standard VESA mounting holes on the back for screwing it into a third-party stand. However, the only way to adjust the stand is to tilt it up to 3 degrees forward or up to 17 degrees back. It can’t go up and down, swivel or rotate.
Bottom line

With a three-year warranty, the AOC is available at prices starting under $200, the least expensive of the three. If you just need a basic monitor that can offer some savings in electric bills, this is a good choice, but its lack of a webcam among other features may limit its usefulness.

The Dell UltraSharp 27 may be the most expensive of the three displays reviewed here, but it delivers the best mix of screen and multimedia accessories.

The gray and black monitor takes up the least desktop space of the three, something to consider if you’re part of a company that is tight on cubicle space. Built around an IPS panel that offers 1920 x 1080 resolution, the Dell uses hardened anti-glare glass.

There are controls up front for turning the display on and off, using the on-screen menu, and turning the volume up or down; there’s also a handy mute button. I was surprised and impressed by a button with a telephone receiver icon that can initiate or answer a phone call over Microsoft’s Lync VoIP system. (To get this to work, you’ll need to link the screen with a PC via a USB cable.)

The Dell comes with a seductive-sounding PowerNap feature, which triggers the display’s sleep mode when the computer’s screen saver comes on. The monitor first dims the screen’s brightness and then shuts itself down. The screen comes back on when the host computer’s screen saver shuts off. In my tests, the screen woke up in 1.5 seconds.

While being used, the Dell UltraSharp 27 consumed 23.6 watts of power, the least amount of the three. This drops to 1.1 watts when in sleep mode, half what the Philips monitor uses in the same mode.

Based on a typical usage scenario (assuming it’s on for 10 hours a day for every business day and in idle mode the rest of the time, and that power costs 12 cents per kilowatt-hour), this adds up to an estimated annual power cost of $9.45, halfway between the higher-cost Philips and less expensive AOC monitors.
How well it worked

The display was able to deliver 246 candelas per square meter of brightness, the brightest of the three reviewed here. Its reds and greens were spot on, but the screen’s blues appeared slightly washed out. The screen was able to render smooth video; however, its video response time of 8ms is the slowest of the three.

The Dell monitor’s on-screen menu has controls for tweaking brightness, contrast and sharpness as well as adjusting the display’s gamma settings for using a Windows PC or a Mac. To do any meaningful customization, though, you’ll need to load the included Display Manager software. This application, which only works with Windows PCs, includes the ability to change the display’s color temperature as well as choose among Standard, Multimedia, Gaming or Movie modes.

The Dell is a fine all-around monitor; it excels at delivering all the audio-visual accessories that a modern desktop requires. These include an HD webcam for video conferences as well as a dual-microphone array that does a good job of capturing your voice while reducing noise.

The display’s pair of speakers sounded surprisingly good and can actually get too loud for an office. There is a headphone jack on the side, but the Dell lacks the AOC’s audio-in jack.
Other features

The Dell display has the best assortment of ports as well, including one VGA, one DisplayPort and two HDMI ports, both of which can work with an MHL adapter and a compatible phone or tablet. The ports are oriented vertically rather than the more convenient horizontal orientation of the AOC display.

Unlike the others, the display has two USB 2.0 ports and a single USB 3.0 port. All of its cables can be routed through a hole in the back of the monitor’s stand to keep them tidy. On the other hand, the stand’s adjustability is limited: It tilts forward by 5 degrees and back by 22 degrees, but it can’t go up and down, rotate or swivel.

As is the case with the AOC and Philips displays, you can remove the display from the stand and use its VESA mounting holes for use with a third-party stand or mounting hardware. You just need to press a spring-loaded button to release the panel from the stand.
Bottom line

The Dell UltraSharp 27 includes a three-year warranty and it has a list price of $450, considerably more than the AOC’s cost. (Note that the cost of the Dell changed several times while this review was being written and edited.) But given that, the Dell provides all the accoutrements needed for doing desktop work without wasting power.

As minimalist as a monitor gets these days, the Philips 271S4LPYEB is not only power-aware but knows when you’re sitting in front of it and can automatically go to sleep when you’re not. Too bad it lacks creature comforts like a webcam, speakers or even an HDMI port.

The all-black display houses a 1920 x 1080 IPS panel that is rated at 5ms response time.

Perhaps the most interesting feature is a pair of infrared sensors that perceive whether someone is sitting in front of the screen. Called PowerSensor, the system can be set to four different distances (user to screen) between 12 in. and 40 in. It’s quite an impressive trick. One minute after the space in front of the display is vacated, the image dims; two minutes later, the screen goes black. Then, like magic, the screen lights back up when you sit in front of it. When I tried it, the screen came back to life in less than a second.

After some fiddling (to figure out which distance setting was best for me), I found it worked well and quickly responded to my absence and return. I was able to fool it, though, by leaving my desk chair with its back to the screen.

The Philips used 28.8 watts of power in Office mode, which was similar to the standard modes of the other two displays (however, power use varied only slightly with the other modes). When the PowerSensor kicked in, the power demand was initially reduced to 10.6 watts for one minute and then to 2.1 watts.

Ironically, though, the Philips turned out to be the highest power user of the three — probably because of the overhead required to keep the PowerSensor active and ready to restart the display. All told, using my assumptions that it was used for 10 hours a day for every business day and that electricity costs 12 cents per kilowatt-hour, the display had an estimated annual operating expense of $10.20.

In the front, the Philips monitor has a control for fine-tuning the PowerSensor along with others for turning the display on and off and working with the screen’s menu. A large bluish-green LED shows that the display is turned on. There are also buttons for adjusting the brightness level and selecting the company’s SmartImage feature.

SmartImage optimizes the display’s contrast to suit what you’re looking at. It has preset modes for Office, Photo, Movie, Game or Economy (which reduces its brightness by two-thirds). There’s also an adjustment for the screen’s color temperature with six settings available between 5,000K and 11,500K.

With the ability to deliver 221 candelas per square meter, the Philips monitor delivered rich blues and sharp yellows, but the display’s greens were too light and its reds appeared dull. Its ability to show video was very good — clear and smooth with no frame drops.

Loading the included Smart Control Premiere app (Windows PCs only) provides a deeper level of customization. It has the ability to change the screen’s black level and adjust the gamma settings. A big bonus is that it has a series of test images that you can use to calibrate the display.
Other features

While the AOC and Dell monitors have built-in speakers, the Philips lacks speakers, webcam, microphone and USB ports. In other words, it is a display and nothing more — rather unusual in today’s market.

Its input ports are oriented vertically, rather than horizontally like the AOC display’s more convenient layout. The Philips has one DisplayPort, one DVI and one VGA port, but no HDMI port. As a result, I used its DVI input with an HDMI adapter.

The Philips does offer the best stand of the trio. With little effort, the display can be tilted forward 5 degrees and back by up to 20 degrees; it can also be raised or lowered by 6.3 in. and swiveled to the right or left by up to 140 degrees.

The entire display can also be easily rotated from landscape to portrait. This is useful if you want to work with a long document or a vertically oriented website without continually scrolling. The monitor’s software reorients the image after the screen is rotated.

After pressing a button in the back, you can remove the display from the stand, revealing its VESA mounting holes. This allows it to be used with a third-party stand or mounting hardware.
Bottom line

The Philips display comes with a three-year warranty and starts at a retail price of about $260, between the cheaper AOC monitor and the better equipped Dell display. While I love the display’s ability to sense when I’m working and when I’m someplace else — and the well-constructed stand — the Philips really needs some further refinement and power reduction before it’s ready for my office.

After using each of these three monitors for several weeks, I would love an amalgam of the three that is built around the Philips adaptable stand, the AOC’s power-saving abilities and the Dell’s bright screen.

That said, the PowerSensor feature on the Philips 271S4LPYEB is impressive and works well, but it uses too much electricity to be of much use.

I love that the AOC E2752VH doesn’t use a watt when it’s asleep. At $240, it is also the cheapest to get and use, but that’s not enough compensation for having the least bright monitor of the three.

The Dell UltraSharp 27 UZ2715H may not have the fastest display panel, but it is fine for business work and is the best equipped and brightest of the three — and uses a reasonable amount of power. I wish that the stand were more adaptable, but no other screen here does so much.

To see how these 27-in. monitors compare, I set each up in my office for at least a week as my primary display. I used each of them to write emails, edit text, create spreadsheets, watch videos, nose around on the Web and work with interactive online programs.

After unpacking and putting each together, I spent some time measuring and investigating how each stand can tilt, raise or rotate the screen. Then I looked over the display’s ports, speakers, microphone and webcam. I looked at the monitor’s controls and tried out the device’s features.

Then I connected each of the monitors to an iPad Mini (with an HDMI adapter), a Toshiba Radius P-55W notebook and a Nexus 7 tablet (connecting via a Chromecast receiver). Each screen was able to work with each source; since the Philips display lacks an HDMI port, I used its DVI port with an HDMI-to-DVI adapter.

I next measured each screen’s brightness with a Minolta LM-1 light meter using a white image in a darkened room. After measuring the light level at nine locations, I averaged them and converted the result to candelas per square meter. I then displayed a standard set of color bars and compared the three displays using an Orei HD104 four-way video distribution amplifier and a Toshiba Radius computer as the source.

To see how these monitors save power, I looked into their power conservation settings and software. I checked out how flexible the setting was for putting the display to sleep and measured how much electricity each monitor used with a Kill a Watt power meter.

Using the average U.S. price of 12 cents per kilowatt-hour of electricity, I estimated how much it might cost to operate each monitor, based on the assumption that it was used for 10 hours a day over the work year (250 days) and was asleep for the rest of the time.
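
For readers who want to apply the same math to their own monitors, here is the cost model spelled out as a short Python sketch. The wattage figures are the ones measured above; small differences from the dollar figures quoted in the review come down to rounding and the exact usage assumptions, which the review does not spell out in full.

# Annual-cost model used in this review: active 10 hours a day for 250
# business days, asleep the remaining hours of the year, at $0.12 per kWh.
RATE = 0.12                      # dollars per kilowatt-hour (U.S. average)
ACTIVE_HOURS = 10 * 250
SLEEP_HOURS = 365 * 24 - ACTIVE_HOURS

def annual_cost(active_watts, sleep_watts):
    kwh = (active_watts * ACTIVE_HOURS + sleep_watts * SLEEP_HOURS) / 1000.0
    return kwh * RATE

# Measured draw (active watts, sleep watts) from the review above.
monitors = [
    ("AOC E2752VH", 27.5, 0.0),
    ("Dell UltraSharp 27 UZ2715H", 23.6, 1.1),
    ("Philips 271S4LPYEB", 28.8, 2.1),
]
for name, active, sleep in monitors:
    print("%-28s ~$%.2f per year" % (name, annual_cost(active, sleep)))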



 

Although degrees and IT certifications can be great eye candy for a resume, experience is king. As you may have encountered, a lack of experience can be a major roadblock to getting interest from employers in your early years.

Though you might have the Network+ or CCNA cert, for instance, have you actually configured or played around with a network? Even if you already have held a network technician or administrator position, you might not have experience with all aspects of networking yet. Fortunately there are ways to get hands-on network administration experience, even at home — and most don’t cost anything.

In this story I discuss nine self-taught labs on various networking topics, explaining the basics and how to get started. I begin with easier, newbie-level projects and progress to more complex ones requiring more thought and time. Some of the tasks take just a few minutes, while others are suitable for a weekend project. You may want to invest in some networking gear to teach yourself the basics, but there are ways around this.

Beginner
Project 1: Configure TCP/IP Settings

One of the most basic network admin tasks is configuring the TCP/IP settings. If a network isn’t using the Dynamic Host Configuration Protocol (DHCP) that automatically hands out IP addresses after clients connect, you’ll have to manually set static IP and DNS addresses for each client. You may also be required to temporarily set static IP information when doing the initial setup and configuration of routers or other network components.

To set static IP details you must know the IP address of the router and the IP address range in which you can configure the client. You can figure this out from the settings of a computer already successfully connected to the network.

You’ll need the IP address as well as the Subnet Mask, the router’s IP address (a.k.a. the Default Gateway) and the Domain Name System (DNS) Server addresses.

In Windows: Open the Network Connections via the Control Panel or Network and Sharing Center. Next, open the connection that is already on the network and click the Details button.
In Mac OS X: In System Preferences, click the Network icon, then select the connection that is already on the network, such as AirPort (wireless) or Ethernet (wired). With a wired connection you’ll likely see the info you need on the first screen; for a wireless connection, additionally click the Advanced button and look under the TCP/IP and DNS tabs.

Write the numbers down or copy and paste them into a text file, and then close the window.

A subnet calculator shows the acceptable IP address range for a network.

To see the acceptable IP address range for the network, you can input the IP address and Subnet Mask into a subnet calculator. For example, inputting the IP of 192.168.1.1 and Subnet Mask of 255.255.255.0 shows the range of 192.168.1.1 to 192.168.1.254.
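
If you’d rather not rely on a web-based calculator, Python’s standard ipaddress module (my suggestion; the article doesn’t mention it) does the same math:

# Derive the usable host range from an IP address and subnet mask,
# mirroring the 192.168.1.1 / 255.255.255.0 example above.
import ipaddress

iface = ipaddress.ip_interface("192.168.1.1/255.255.255.0")
net = iface.network                          # 192.168.1.0/24

hosts = list(net.hosts())                    # every usable host address
print("Network:    " + str(net))
print("Host range: " + str(hosts[0]) + " - " + str(hosts[-1]))   # .1 to .254
print("Broadcast:  " + str(net.broadcast_address))               # 192.168.1.255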

Even though you now know the IP address range, remember that each device must have a unique IP. It’s best to check which IP addresses are taken by logging into the router, but you could also take an educated guess or simply choose a random address within the range. If the address is already taken by another device, Windows or OS X will likely alert you of an IP conflict and you can choose another. Once the IP address is set, write it down or save it in a document; a best practice is to keep a log of all the static IPs along with the serial numbers of the computers that use them.

Manually setting the computer’s IP address.

Now, to set a static IP address:
In Windows: Open the Network Connection Status window, click the Properties button and open the Internet Protocol Version 4 (TCP/IPv4) settings. Choose “Use the following IP address” and enter the settings: an IP address that’s in the acceptable range, plus the Subnet Mask, Default Gateway and DNS Server from the Network Connection Details window.
In Mac OS X: Open the Network window and click the Advanced button. On the TCP/IP tab, click the drop-down next to Configure IPv4, choose Manually and enter an IP address that’s in the acceptable range, plus the Subnet Mask and router address you copied earlier. Go to the DNS tab and enter the DNS Server address you copied before.
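
If you end up applying the same settings to many Windows machines, the classic netsh commands can do the job from a script. The sketch below is a hypothetical example: the interface name, addresses and DNS server are placeholders, it must be run from an administrator prompt, and the exact netsh syntax can vary a little between Windows versions.

# set_static_ip.py -- hypothetical example: apply Project 1's static settings
# with netsh instead of the GUI. Run from an elevated prompt and replace the
# placeholder values with ones that are valid on your network.
import subprocess

IFACE   = "Local Area Connection"   # name shown in Network Connections
IP      = "192.168.1.50"
MASK    = "255.255.255.0"
GATEWAY = "192.168.1.1"             # the router / Default Gateway
DNS     = "192.168.1.1"

def run(cmd):
    print("> " + cmd)
    subprocess.call(cmd, shell=True)

# Static IP address, subnet mask and default gateway.
run('netsh interface ip set address name="%s" static %s %s %s'
    % (IFACE, IP, MASK, GATEWAY))

# Primary DNS server (use "netsh interface ip add dns" for secondaries).
run('netsh interface ip set dns name="%s" static %s' % (IFACE, DNS))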

Project 2: Use a Wi-Fi stumbler

As a network admin, you’ll likely help set up, troubleshoot and maintain the wireless portion of the network. One of the most basic tools you should have is a Wi-Fi stumbler. These tools scan the airwaves and list the basic details about nearby wireless routers and access points (APs), including the service set identifier (SSID), also known as the network name; the MAC address of the router/AP; the channel; the signal level; and the security status.

You can use a Wi-Fi stumbler to check out the airwaves at home or at work. For instance, you can check which channels are being used by any neighboring wireless networks so you can set yours to a clear channel. You can also double-check to ensure all the routers or access points are secured using at least WPA or WPA2 security.
The NetSurveyor stumbler gives a text-based readout and visual charts of wireless channel usage and signals.

Vistumbler and NetSurveyor (for Windows), KisMAC (for OS X) and Kismet (for Windows, OS X and Linux) are a few free options that give both text-based readouts and visual charts of the channel usage and signals. Check out my previous review of these and others.
The Wifi Analyzer app provides a nice visualization for channel usage.

If you have an Android phone or tablet, consider installing a Wi-Fi stumbler app on it for a quick, less detailed look at the Wi-Fi signals. Wifi Analyzer and Meraki WiFi Stumbler are two free options. See my previous review of these and others.
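
On a Windows laptop you can also get a rough, stumbler-style readout with nothing but the built-in netsh command. The snippet below is a quick convenience script of my own, not one of the tools above; it just echoes the SSID, signal, channel and authentication lines from “netsh wlan show networks mode=bssid”.

# quick_stumble.py -- rough Wi-Fi survey on Windows using the built-in
# "netsh wlan show networks mode=bssid" command. It only echoes the SSID,
# signal, channel and authentication lines; use a real stumbler for more.
import subprocess

out = subprocess.check_output("netsh wlan show networks mode=bssid", shell=True)
if isinstance(out, bytes):
    out = out.decode("ascii", "ignore")

for line in out.splitlines():
    s = line.strip()
    if s.startswith(("SSID", "Signal", "Channel", "Authentication")):
        print(s)
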
Project 3: Play with a wireless router or AP

To get some experience with setting up and configuring wireless networks, play around with your wireless router at home. Or better yet, get your hands on a business-class AP: See if you can borrow one from your IT department, check eBay for used gear or consider buying new equipment from lower-cost vendors such as Ubiquiti Networks, where APs start at around $70.

To access a wireless router’s configuration interface, enter its IP address into a web browser. As you’ll remember from Project 1, the router’s address is the same as the Default Gateway address that Windows lists in the Details window for your wireless network connection.

Accessing an AP’s configuration interface varies. If there’s a wireless controller, it’s the one interface you’ll need to configure all the APs; with controller-less systems you’d have to access each AP individually via its IP address.

Once you’ve accessed the configuration interface of your router or AP, take a look at all the settings and try to understand each one. Consider enabling wireless (or layer 2) isolation if supported and see how it blocks user-to-user traffic. Perhaps change the IP address of the router/AP in the LAN settings and/or for routers, disable DHCP and statically assign each computer/device an IP address. Also consider setting a static DNS address (like from OpenDNS) in the WAN settings. You might also look into the Quality of Service (QoS) settings to prioritize the traffic. When you’re done experimenting, make sure it’s set to the strongest security — WPA2.
A typical AP interface: I’ve statically assigned this AP an IP address and DNS servers.

If you can’t get your hands on a business-class AP, consider playing around with interface emulators or demos offered by some vendors, as Cisco does with its small business line.
Intermediate

Project 4: Install DD-WRT on a wireless router
For more experimentation with wireless networking, check out the open-source DD-WRT firmware for wireless routers. For compatible routers, DD-WRT provides many advanced features and customization seen only in business- or enterprise-class routers and APs.

For instance, it supports virtual LANs and multiple SSIDs so you can segment a network into multiple virtual networks. It offers a VPN client and server for remote access or even site-to-site connections. Plus it provides customizable firewall, startup and shutdown scripts and supports a few different hotspot solutions.
DD-WRT loads a whole new feature set and interface onto the router.

For more on DD-WRT and help on installing it on your router, see “Teach your router new tricks with DD-WRT.”
Project 5: Analyze your network and Internet traffic

As a network admin or engineer you’ll likely have to troubleshoot issues that require looking at the actual packets passing through the network. Though network protocol analyzers can cost up to thousands of dollars, Wireshark is a free open-source option that works on pretty much any OS. It’s feature-rich, with support for live and offline analysis of hundreds of network protocols, decryption for many encryption types, powerful display filters, and the ability to read/write via many different capture file formats.

Wireshark capturing network packets.

Once you get Wireshark installed, start capturing packets and see what you get. In other words, browse around the Web or navigate network shares to see the traffic fly. Keep in mind you can stop the live capturing to take a closer look. Although Wireshark can capture all the visible traffic passing through the network, you may see only the traffic to and from the client whose packets you’re capturing if “promiscuous” mode isn’t supported by your OS and/or the network adapter. (For more information, see the Wireshark website.)
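
If you’d like to drive captures from a script as well as click through them in the GUI, the Python library Scapy (my addition; the article itself sticks to Wireshark and friends) can grab a handful of packets and print one-line summaries. It needs administrator or root privileges, and the caution in the note below applies just as much here.

# capture_sample.py -- tiny Scapy sketch: sniff 20 web-ish packets and print
# one-line summaries, roughly what Wireshark's packet list shows. Needs
# admin/root rights; only run it on your own network (see the note below).
from scapy.all import sniff

def show(pkt):
    # summary() yields lines like "Ether / IP / TCP 10.0.0.5:51514 > 93.184.216.34:https S"
    print(pkt.summary())

# filter= takes a BPF capture filter, the same syntax tcpdump and Wireshark use.
sniff(count=20, filter="tcp port 80 or tcp port 443", prn=show, store=False)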

Note: Even though packet capturing is usually only a passive activity that doesn’t probe or disturb the network, some consider monitoring other people’s traffic a privacy or policy violation. So you don’t get into trouble, you ought to perform packet capturing only on your personal network at home — or request permission from management or the CTO before doing it on your work network. In fact, you should clear it with management before doing any monitoring or analysis of a company or school network.

There are other free network analyzers you might want to experiment with. For instance, the EffeTech HTTP Sniffer can reassemble captured HTTP packets and display a Web page, which can visually show you or others what’s captured rather than looking at the raw data packets. Password Sniffer “listens” just for passwords on your network and lists them, which shows just how insecure clear-text passwords are. And for mobile analysis via a rooted Android phone or tablet, there are free network analyzers like Shark for Root.

Project 6: Play with network emulators or simulators
Though you might not be able to get your hands on enterprise-level network gear for practicing, you can use emulators or simulators to virtually build and configure networks. They can be invaluable tools for preparing for IT certifications, including those from Cisco and Juniper. Once you create virtual network components and clients you can then configure and administer them with emulated commands and settings. You can even run network analyzers like Wireshark on some, to see the traffic passing through the network.

Here are a few of the many emulators and simulators:
The GNS3 Graphical Network Simulator is a popular free and open source choice. It requires you to supply the OS, such as Cisco IOS or Juniper’s Junos OS, which usually requires a subscription or support contract from the particular vendor, but you may be able to get access via the IT department at work or school.

The GNS3 Graphical Network Simulator supports Cisco IOS/IPS/PIX/ASA and Juniper JunOS.

Netkit is another free and open source option. It doesn’t include vendor-specific functionality and is limited to generic networking components, but it also doesn’t require you to have the OSes as GNS3 does.

The Boson NetSim Network Simulator is a commercial offering with pricing starting at $99; its intent is to teach Cisco’s IOS. It offers a free demo download, but the demo’s functionality is greatly limited.

There are also websites, such as SharonTools and Open Network Laboratory, that offer remote admin access to network components and Web-based emulators for you to practice with commands. Network World has a nice roundup of free Cisco simulators and emulators.
Advanced

Project 7: Perform penetration testing on your own network
You can read and read about network security, but one of the best ways to learn about or verify security is penetration testing. I don’t mean you should snoop on your neighbors or hack a business; try it on your own network so you don’t end up in the slammer.

Perhaps find a network vulnerability that interests you, research how to take advantage of it and, once you’ve done the hack, make sure you understand how it was possible. And always ensure your network and those you administer are protected from the vulnerability.

Here are a few hacks you could try:
Crack Wi-Fi encryption (WEP is the easiest) with Aircrack-ng.
Crack a Wi-Fi Protected Setup (WPS) registrar PIN with Reaver-WPS to gain access to a wireless router.
Hijack online accounts via Wi-Fi using the Firefox add-on Firesheep or the Android app DroidSheep.
Capture and crack 802.1X credentials using FreeRadius-WPE.

When researching, you’ll likely find how-to tutorials on exactly how to do the hacks and what tools you need. One popular toolkit packed with hundreds of penetration testing tools is the BackTrack live CD, but that project is no longer maintained. Kali Linux has emerged as its successor; it can be installed on a computer or virtual machine, or run via live CD or USB.

If you find you like penetration testing, perhaps look into becoming an Ethical Hacker.

Project 8: Set up a RADIUS server for enterprise Wi-Fi security

At home you likely encrypt your wireless router with the Personal or Pre-shared Key (PSK) mode of WPA or WPA2 security to keep others off the network and to prevent them from snooping on your traffic. The Personal mode is the simplest way to encrypt your Wi-Fi: Set a password on the router and simply enter it on the devices and computers you connect.

Businesses, however, should use the Enterprise mode of WPA or WPA2, which incorporates 802.1X authentication. It is much more complex to set up than the Personal mode but provides better protection. Instead of a global Wi-Fi password, each user receives his or her own login credentials, and the encryption protects against user-to-user snooping. Plus, you can change or revoke individual login credentials to protect the network when an employee leaves or a device is lost or stolen.

To use the enterprise mode you must have a separate Remote Authentication Dial-In User Service (RADIUS) server to handle the 802.1X authentication of users. As a network admin you’ll likely have to configure and troubleshoot clients with 802.1X authentication and help maintain the RADIUS server. For practice, consider setting up your own server and using enterprise-level Wi-Fi security on your home network.

If you’re working on a network that has a Windows Server, the Network Policy Server (NPS) or Internet Authentication Service (IAS) component can be used for the RADIUS server. But if not, you have a couple of free options. If you want some Linux experience, consider the open-source FreeRADIUS. Some easier-to-use options that include a Windows GUI are the freeware TekRADIUS and the 30-day free trials of commercial products like ClearBox. In a previous review, I evaluated these and other low-cost RADIUS servers.

Once you have the RADIUS server installed, create user accounts and enter shared secrets (passwords) for the APs. Then configure the wireless router or APs with WPA/WPA2-Enterprise: enter the RADIUS server’s IP address and port, plus the shared secret you defined on the RADIUS server. Finally, connect clients using the login credentials you defined on the RADIUS server.
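
A quick way to confirm the server side is working before you touch the APs is to send it a test authentication request. The sketch below uses the third-party pyrad library (my own choice, not something this project requires); the server address, shared secret and account are hypothetical, and pyrad needs a standard RADIUS attribute dictionary file named "dictionary" in the working directory (pyrad ships examples).

    # radius_test.py -- minimal sketch: send an Access-Request to your RADIUS server.
    # Assumes "pip install pyrad" and a RADIUS attribute file named "dictionary".
    import pyrad.packet
    from pyrad.client import Client
    from pyrad.dictionary import Dictionary

    srv = Client(server="192.168.1.10",      # hypothetical RADIUS server address
                 secret=b"testing123",       # shared secret defined for this client
                 dict=Dictionary("dictionary"))

    req = srv.CreateAuthPacket(code=pyrad.packet.AccessRequest, User_Name="testuser")
    req["User-Password"] = req.PwCrypt("testpassword")

    reply = srv.SendPacket(req)
    print("Access-Accept" if reply.code == pyrad.packet.AccessAccept else "Access-Reject")

This exercises simple PAP authentication rather than the full 802.1X/EAP exchange, but it does confirm that the server, the shared secret and the user database are wired up correctly.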

Here are a few previous articles you may want to check out: 6 secrets to a successful 802.1X rollout, Tips for troubleshooting 802.1X connections, and Lock Down Your Wi-Fi Network: 8 Tips for Small Businesses.
Project 9: Install Windows Server and set up a domain

As a network admin you’ll likely manage Microsoft-based networks running Windows Server. To gain more experience, consider running Windows Server at home.

Although purchasing a copy of a server edition probably isn’t feasible just for tinkering, there are some free options. Microsoft provides 180-day free trials: a downloadable ISO for installing on a physical machine, a virtual hard drive (VHD) for running on a virtual machine, and access to a pre-configured virtual machine in the Windows Azure cloud. The company also offers Virtual Labs — guided tutorials in a virtual environment — that you might want to check out.

Once you get access to a server, discover and experiment. Perhaps configure Active Directory and play with Group Policies, set up Exchange and configure an Outlook client, or set up NPS for 802.1X authentication.
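
Once the domain is up, it’s worth verifying it from a client the way an application would. The sketch below uses the third-party ldap3 Python library (my own choice, not anything Microsoft requires) to bind to a hypothetical lab domain controller and list user accounts, which confirms that Active Directory is answering LDAP queries.

    # ad_check.py -- minimal sketch: query a lab Active Directory domain over LDAP.
    # Assumes "pip install ldap3"; host, domain and credentials are hypothetical.
    from ldap3 import Server, Connection, NTLM, ALL

    server = Server("dc01.lab.local", get_info=ALL)
    conn = Connection(server,
                      user="LAB\\Administrator",   # DOMAIN\username
                      password="ChangeMe1!",
                      authentication=NTLM,
                      auto_bind=True)

    conn.search("dc=lab,dc=local",                 # the domain's base DN
                "(objectClass=user)",
                attributes=["sAMAccountName"])
    for entry in conn.entries:
        print(entry.sAMAccountName)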
Next steps

If you found these projects useful for learning and gaining experience, keep in mind that there are many more self-guided labs available online. Try searching for labs on the specific certifications you’re interested in or the network vendors whose gear you’d like to administer.

Though virtual environments and emulators provide a quick and easy way to get hands-on experience, also try to get as much time as you can with the real gear. Ask the IT department if you can borrow any spare equipment, and take advantage of any other chances you spot to get real-world experience.




The devices connected to your router battle for bandwidth like thirst-crazed beasts jostling for access to a receding watering hole. You can’t see the melee, but you can feel its impact. Without intervention, the strongest competitors—a BitTorrent download, for instance—will drink their fill, even if it’s not essential to their survival, while others—a VoIP call, a Netflix stream, or a YouTube video—are left to wither and die.


A router with good Quality of Service (QoS) technology can prevent such unequal distribution of a precious resource. You can dip only one straw into the Internet at a time, after all. QoS ensures that each client gets its chance for a sip, and it also takes each client’s specific needs into account. BitTorrent? Cool your jets. If one of your packets is dropped, it’ll be resent. You can run in the background. Netflix, VoIP, YouTube? Lag results in a bad user experience. Your data gets priority.

That’s a gross oversimplification, of course. Here’s a more in-depth explanation. QoS, also known as traffic shaping, assigns priority to each device and service operating on your network and controls the amount of bandwidth each is allowed to consume based on its mission. A file transfer, such as the aforementioned BitTorrent, is a fault-tolerant process. The client and the server exchange data to verify that all the bits are delivered. If any are lost in transit, they’ll be resent until the entire package has been delivered.

That can’t happen with a video or audio stream, a VoIP call, or an online gaming session. The client can’t ask the server to resend lost bits, because any interruption in the stream results in a glitch (or lag, in terms of game play). QoS recognizes the various types of traffic moving over your network and prioritizes it accordingly. File transfers will take longer while you’re watching a video or playing a game, but you’ll be assured of a good user experience.
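
Applications can also help the router out by marking their own packets. Many QoS implementations look at the Differentiated Services (DSCP) field in each IP header, and setting it takes one line of code. The Python sketch below tags a UDP socket with DSCP 46 (Expedited Forwarding, the value conventionally used for voice); the destination address is hypothetical, and whether the marking is honored is entirely up to your router’s QoS configuration.

    # dscp_mark.py -- minimal sketch: mark outgoing UDP packets as Expedited Forwarding.
    # Works on Linux/macOS; the router may ignore or rewrite the marking.
    import socket

    DSCP_EF = 46 << 2    # DSCP occupies the upper six bits of the IP TOS byte

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP_EF)

    # Everything sent on this socket now carries the EF code point.
    sock.sendto(b"latency-sensitive payload", ("192.0.2.10", 5060))  # hypothetical endpoint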

Traditional QoS

Different routers take different approaches to QoS. With some models, you simply identify the type of traffic you want to manage and then assign it a priority: High, medium, or low. With others, you can choose specific applications, or even identify the specific ports a service or application uses to reach the Internet. Yet another way is to assign priority to a specific device using its IP or MAC address.


Many older routers, such as this Netgear WNR2000 802.11n model, have predefined Quality of Service for a limited number of applications, but you must configure your own rules for anything the manufacturer didn’t think of.

Configuring QoS this way can be very cumbersome, requiring lots of knowledge of protocols, specific details about how your router operates, and networking in general. Some routers, for instance, depend on you to inform them of the maximum upload and download speeds your ISP supports. Enter the incorrect values, and your network might perform worse instead of better.

Fortunately, router manufacturers have made great strides in making QoS easier to configure. In some cases, it’s become entirely automatic.

Intelligent QoS

Some routers include the option of automated QoS handling. Most newer models support the Wi-Fi Multimedia (WMM) standard, for instance. WMM prioritizes network traffic in four categories, from highest to lowest: Voice, video, best effort (most traffic from apps other than voice and video), and background (print jobs, file downloads, and other traffic not sensitive to latency). WMM is good as far as it goes, but it ameliorates only wireless network contention. It does nothing to resolve the battle for bandwidth among wired network clients.

Better routers go further to cover both sides of the network. They automatically choose which traffic gets priority based upon assumptions—putting video and voice ahead of file downloads, for instance. The intelligence behind each vendor’s QoS functionality, however, varies according to the quality of the algorithm in use and the processor power available to run it.


Qualcomm’s StreamBoost technology enables the D-Link DGL-5500 to display exactly what’s consuming the majority of your network’s bandwidth.

Right now, Qualcomm’s StreamBoost traffic-shaping technology seems to be the hot QoS ticket. StreamBoost, first announced in January 2013, is based on technology originally developed by Bigfoot Networks. Bigfoot, a company that Qualcomm acquired in 2011, designed network-interface cards targeted at gamers, who are among the most latency-sensitive computer users in the world.

Qualcomm doesn’t manufacture routers, but the company does design and manufacture processors that go into high-end consumer routers such as Netgear’s Nighthawk X4 and D-Link’s DGL-5500 Gaming Router. While there’s no technological barrier to running StreamBoost on a Marvell or Broadcom processor, Qualcomm currently doesn’t license the firmware separately from its chips.

StreamBoost can distinguish between and prioritize latency-sensitive traffic (audio, video, gaming, and so on) over latency-insensitive traffic (downloads, file transfers, etc.), and it can adjust its allocation of bandwidth to various network activities to ensure all clients get a good experience. If several clients are streaming Netflix videos at the same time, for instance, it can automatically reduce one or more of those streams from 1080p quality to 720p quality to ensure all the sessions have enough bandwidth.

What’s more, StreamBoost can distinguish among the types of client devices and reduce the image quality streaming to a smartphone or tablet, because the degradation won’t be as noticeable on those small screens as it would be on a big-screen smart TV.


StreamBoost lets you assign priorities to client PCs, so you can preserve bandwidth for a smart TV at the expense of a PC used for BitTorrent downloads, for instance.

StreamBoost’s bandwidth graphs and tools provide better visibility and more precise tuning than other QoS tools I’ve seen. And if you opt in to participate, you’ll receive ongoing updates from Qualcomm’s database in the cloud so that your router can continually optimize its performance and learn how to handle new devices that come on the market. StreamBoost support alone won’t make a crappy router great, but it can make a difference.

Don’t stop with QoS

Good Quality of Service is essential if you use your network to stream video, play online games, make VoIP and Skype calls, or watch YouTube (and if you don’t do any of those things, you wouldn’t have clicked on this story in the first place). The performance benefits you’ll realize might even save you from moving up to a pricier service tier with your ISP.

Linksys WRT1900AC Wi-Fi router
An 802.11ac router can deliver higher performance even with clients that are equipped with 802.11n adapters.

But there are other things you can do beyond traffic shaping. Perform a site survey using a tool such as Kismet to see which radio channels your neighbors are relying on, and configure your router to use something else. There are only three non-overlapping channels in the 2.4GHz frequency band: 1, 6, and 11. Use one of these if possible.

If you have a dual-band router that supports both the 2.4GHz and 5GHz frequency bands, use the less-crowded higher band for latency-sensitive traffic such as media streaming, and reserve 2.4GHz for things like downloads. There are many more non-overlapping channels at 5GHz, and the higher channels (149 and up) are allowed to use more transmit power than the lower channels, which helps at longer range.

Lastly, if you’re using an 802.11n (or older) router, consider moving up to a model based on the newer 802.11ac standard. Even if your clients are stuck with 802.11n adapters, you’ll still see a significant performance boost with an 802.11ac router.



Google is on track to spend more money this year attempting to influence lawmakers than any other tech company

Google and Facebook continued to pour millions of dollars into political lobbying in the third quarter in attempts to influence U.S. lawmakers and have legislation written in their favor.

Google spent $3.94 million between July and September while Facebook spent $2.45 million, according to disclosure data published Tuesday.

The only tech-related company to outspend Google was Comcast, which is trying to persuade politicians to look favorably on its proposed merger with Time Warner Cable and spent $4.23 million during the quarter.

But Google stands as the largest spender in the entire tech industry to date this year. It has run up a $13 million bill lobbying Washington politicians and their offices on a range of issues as diverse as online regulation of advertising, cybersecurity, patent abuse, health IT, international tax reform, wind power and drones.

If industry spending continues at its current level, 2014 will mark the fourth year that Google has spent more money on federal lobbying than any other technology company.

Facebook began lobbying Washington in 2009 and has quickly risen to become the fourth-largest spender in the tech industry so far this year, behind Google, Comcast and AT&T.

The company’s lobbying covers an equally diverse range of areas, including cyber breaches, online privacy, free trade agreements, immigration reform, Department of Defense spending and intellectual property issues.

Another notable spender in the third quarter was Amazon, which plowed $1.18 million into its lobbying efforts. That represents a quarterly record for the Seattle company and is the second quarter in a row that it has spent more than $1 million on lobbying.

Amazon’s lobbying was aimed at many of the same areas targeted by Google and Facebook, but covered additional subjects close to its business, including postal reform, online wine sales, mobile payments and Internet tax payments.

The money is funneled to D.C. lobbying firms that use it to push their clients’ agendas to politicians and their staffers. The lobbying disclosure reports are published quarterly by the U.S. Senate and detail spending in general areas, but do not go into specifics.

Lobbying has long been an effective tool used by major companies, but it’s only been in the last few years that Internet companies have started spending money in amounts to rival traditional tech giants.

During the third quarter, other major spenders included Verizon ($2.91 million), CTIA ($1.95 million), Microsoft ($1.66 million) and Oracle ($1.2 million).

Apple spent just over $1 million in the quarter lobbying on issues including consumer health legislation, transportation of lithium ion batteries, international taxes, e-books, medical devices and copyright.


 


10 Tips to Ensure Your IT Career Longevity

Written by admin
October 19th, 2014

Enjoying a long career doesn’t happen by accident. It takes planning and effort. Use these tips to get your head in the game and keep your eye on the future.

Many people say that IT and technology are a young man’s game, and if you look at the most influential tech companies you might agree: most IT workers employed at those companies are under 35 and male. However, these big-name firms employ only a fraction of tech professionals, and there are plenty of opportunities out there for everyone. IT has one of the lowest unemployment rates of any industry because, in most organizations, technology touches every part of the business.

Be Responsible for Your Own Career
Achieving career longevity in the IT business takes effort, time and resources — and nobody but you can organize, facilitate and be responsible for all of it. To stay ahead of the learning curve you need to think about your goals and architect your future.

Many organizations are getting better at providing embedded employee performance and career management processes, according to Karen Blackie, CIO of Enterprise Systems & Data for GE Capital. However, she warns that you are your own best advocate and should always strive to “own” your career. Don’t wait for your organization to do it for you because that day may never come.

This means stepping back and thinking about where you want to be in X amount of time and then outlining the different skills and experience needed to get there. With that information you can start mapping out your career. “Doing research into what interests you, setting goals and objectives and then having a plan around how you will accomplish those goals is very important,” says Blackie. Remember, positions get eliminated and things don’t always work out, so it’s wise to consider alternate paths.

Flexibility and Agility Required
Technology moves at an unprecedented pace, which means you’ve got to be flexible. “Adaptability is key. CIOs who can’t adapt to that change will see themselves – unfortunately – left behind in a competitive job market. But the CIOs who see each new change – whether mobile, BYOD, Cloud, IoT – as an opportunity are the technology executives who will continue to be in demand – because they’ve proven that they can leverage new solutions to drive business value,” says J.M. Auron, IT executive resume writer and president of Quantum Tech Resumes.
Learn About the Business

“Having the business knowledge is a key foundational element to one’s career,” says GE Capital’s Blackie. Being a great developer isn’t enough if you plan to climb the corporate ladder. You’ve got to understand your industry and how your company does business. That knowledge can also make you a better programmer: understanding the business’s needs helps you deliver products, software and services that better align with the business.

Always Be Learning
The price of career longevity in the world of IT and technology is constant learning. If you aren’t passionate about it, or you’re complacent, it’s easy to find yourself locked into outdated technology and left behind. There are many ways to stay current, such as a formal college program or a certification course. “It is your career and it is up to you to keep educating yourself,” says Robert P. Hewes, Ph.D., senior partner with Camden Consulting Group, with oversight for leadership development and management training.

Professional organizations, conferences, developer boot camps and meet-ups are all great ways to stay abreast of the newest technologies and build network connections within your industry. “It’s often a place where you develop life-long friends and colleagues,” says Blackie.

Attend Industry Conferences

Industry conferences are a great way to learn about the newest trends in technology as well as to network with like-minded people. Be selective about which conferences you attend and make sure you allot the necessary time to socialize and network with your peers.

“One mistake attendees often make at conferences is filling their schedule so tightly with panels that they miss out on the networking available during downtime. It’s important to attend mixers and informal gatherings at conferences to meet your peers and build relationships that could last throughout your career,” says Blackie.

Incorporate Time into Your Day for Reading
Set aside a little time each day to stay current with the goings-on in your corner of technology and beyond. “Become a regular reader of info in your industry, be it an industry journal or an online blog/magazine. There is a lot of information out there. Another quick way to find relevant information is via an aggregator; Pocket and LinkedIn do this,” says Hewes.

Google News and a host of other news aggregators, such as LinkedIn Pulse or Reddit, offer a daily stream of news, with alerts and notifications that let users focus on key areas of interest.

Pay Attention to Competitors
“It’s important to get to know industry competitors and watch what they’re doing. You can learn a lot from the successes and failures of your competitors,” says Blackie. Being first isn’t always required to be successful; doing it better than the next guy is. Find your competitors, as well as organizations you consider thought leaders in your industry, and follow them on social media or create a Google Alert for them.

Find a Mentor or Coach

Mentoring is useful at all levels of one’s career. A mentor can help you negotiate internal politics or provide insight into how to solve lingering problems. You may also have different mentors throughout your career, each offering a different perspective or expertise.

Understand the Value of Social Media
Not everyone adores social media, but it’s a necessary element in the race to separate yourself from other IT professionals. Build and maintain profiles on relevant social media sites and then use them to explain the value proposition you offer.

Work on Soft Skills and Some Not-so-Soft Ones

Branding Skills

Branding is what helps separate you from the rest of the pack and explains your value proposition to your employer or prospective employers. “Branding is another key for career advancement – and one that few technology leaders have fully embraced. Giving thought to that brand is key for career longevity and advancement,” Auron says.

Communication Skills

According to Auron, the ability to find the right path, communicate value and build enthusiasm is a crucial step in transforming the perception of IT from a cost center to a business enabler. “The most critical skill is the ability to communicate the real value of technology investment to nontechnical leadership. Some technologists can fall into one of two traps: giving so much detail that the audience’s eyes glaze over, or appearing patronizing when intelligent – but nontechnical – leaders don’t get a specific reference,” Auron says.

Project Management Skills

At some point in your technology career you will be asked to lead a project. When the time comes, make sure you’ve got the necessary tools. “It is critical if you are headed onto the management track. In fact, you should try to gain wide experience with all kinds of projects,” says Hewes.


 


8 headline-making POS data breaches

Written by admin
October 14th, 2014

The rash of data breaches at U.S. POS terminals has many looking at the chip-and-PIN model used in Europe.

POS swipes
Next month marks an unceremonious anniversary for the 1960s-vintage “swipe-and-signature” magnetic stripe card system. To some, the end of this point-of-sale technology cannot come a moment too soon. While talks continue on implementing the “chip and PIN” system, we look back at the most recent incidents involving data stolen from these devices.

Home Depot
Data thieves were the ones following The Home Depot’s credo of “More doing,” as they did more doing with other people’s money. The Home Depot reported a data breach earlier in September; the home improvement store confirmed that 2,200 stores in the U.S. and Canada were compromised. The number of credit cards affected may have reached 56 million.

Target
Around 70 million customers found coal in their stockings when Target was targeted by thieves just before this past Christmas.

Michaels
The arts and crafts chain reported more than 3 million credit cards affected by identity theft. This was a big one-two punch, with Target’s breach occurring just a month earlier.

Beef O’Brady
Customers had a beef with Beef O’Brady’s when, in early September, the Florida chain’s point-of-sale system was hit by a data breach. There have been no reports yet on how many cards were affected.

Dairy Queen
Dairy Queen was hit with a blizzard of credit card numbers stolen at the end of August. According to Privacyrights.org, Dairy Queen reported a data breach of its POS (point-of-sale) system when malware that authorities are calling “Backoff” was found on the system. The restaurant chain is currently unclear as to how many stores were affected.

P.F. Chang’s
CSO’s Dave Lewis reported: On June 10, 2014, the staff at P.F. Chang’s received a visit they didn’t want knocking on their door. The U.S. Secret Service came calling to alert the restaurant chain that it had been compromised by a criminal gang that was stealing data from its point-of-sale systems.

Shaw’s and Star Market
Shaw’s and Star Market have more than 50 grocery stores in Massachusetts, but there has been no report on how many credit cards were ripped off.

TJ Maxx
At over 45 million cards, the 2007 data breach at TJ Maxx is the granddaddy of them all. According to Computerworld: In filings with the U.S. Securities and Exchange Commission yesterday, the company said 45.6 million credit and debit card numbers were stolen from one of its systems over a period of more than 18 months by an unknown number of intruders.

