Archive for the ‘ Tech ’ Category


NSS Labs recently released the results and analysis from its latest Browser Security Comparative Analysis Report, which evaluated the ability of eight leading browsers — Apple Safari, Google Chrome, Kingsoft Liebao, Microsoft Internet Explorer, Mozilla Firefox, Opera, Qihoo 360 Safe Browser, and Sogou Explorer — to block socially engineered malware (SEM). The use of social engineering to distribute malware continues to account for the bulk of cyber attacks against both consumers and enterprises, making a browser’s ability to protect against these kinds of attacks an important criterion for personal or corporate use.

Microsoft Internet Explorer continues to outperform other browsers. With an average block rate of 99.9 percent, the highest zero-hour block rate, the fastest average time to block, and the most consistent protection over time, Internet Explorer leads in all key test areas.

Google Chrome remained in the top three, but its average block rate fell significantly to 70.7 percent, down from 83.17 percent in the previous test.

Cloud-based endpoint protection (EPP) file scanning provides substantial defenses when integrated with the browser. Kingsoft Liebao browser utilizes the same cloud-based file scanning system used by Kingsoft antivirus and had the second highest overall block rate at 85.1 percent, ahead of Chrome by almost 15 percentage points.


Google’s Safe Browsing API does not provide adequate SEM protection. Apple Safari and Mozilla Firefox both utilize the Google Safe Browsing API and were the two lowest performing browsers in this latest test. Both also saw significant drops of around six percentage points in their average block rates — Safari from 10.15 percent to 4.1 percent and Firefox from 9.92 percent to 4.2 percent.

Chinese browsers tested for the first time prove viable. This year, three browsers from China were included in testing for the first time, and Kingsoft’s Liebao browser jumped ahead of Google Chrome with an overall protection rate of 85.1 percent. Sogou Explorer had the fourth highest average block rate at 60.1 percent.

Commentary: NSS Labs Research Director Randy Abrams
“Selecting a browser with robust socially engineered malware protection is one of the most critical choices consumers and enterprises can make to protect themselves. Microsoft’s SmartScreen Application Reputation technology continues to provide Internet Explorer the most effective protection against socially engineered malware,” said Randy Abrams, Research Director at NSS Labs. “This year NSS added three browsers from China. The Kingsoft Liebao browser displaced Chrome from second place by using a combination of URL filtering with the cloud-based file scanning technology that Kingsoft uses for their antivirus product. Sogou Explorer, another browser from China, was the only other tested browser to exceed 50 percent protection against socially engineered malware. Firefox and Safari failed to achieve five percent effectiveness and leave less technical users at considerable risk.”

NSS Labs recommendations
Learn to identify social engineering attacks in order to maximize protection against SEM and other social engineering attacks.
Use caution when clicking links shared by friends and other trusted contacts, such as banks. Waiting just one day before clicking on a link can significantly reduce risk.
Enterprises should review current security reports when selecting a browser. Do not assume the browser market is static.

The ‘always-on’ IT culture: Get used to it

Written by admin
April 9th, 2014

Around-the-clock accessibility is now expected for a broad range of IT roles. Here’s how to cope.

A couple of weeks into his job as lead Qt developer at software development consultancy Opensoft, Louis Meadows heard a knock on his door sometime after midnight. On his doorstep was a colleague, cellphone and laptop in hand, ready to launch a Web session with the company CEO and a Japan-based technology partner to kick off the next project.

“It was a little bit of a surprise because I had to immediately get into the conversation, but I had no problem with it because midnight here is work time in Tokyo,” says Meadows, who adds that after more than three decades as a developer, he has accepted that being available 24/7 goes with the territory of IT. “It doesn’t bother me — it’s like living next to the train tracks. After a while, you forget the train is there.”

Not every IT professional is as accepting as Meadows of the growing demand for around-the-clock accessibility, whether the commitment is as simple as fielding emails on weekends or as extreme as attending an impromptu meeting in the middle of the night. With smartphones and Web access pretty much standard fare among business professionals, people in a broad range of IT positions — not just on-call roles like help desk technician or network administrator — are expected to be an email or text message away, even during nontraditional working hours.

The results of Computerworld’s 2014 Salary Survey confirm that the “always-on” mentality is prevalent in IT. Fifty-five percent of the 3,673 respondents said they communicate “frequently” or “very frequently” with the office in the evening, on weekends and holidays, and even when they’re on vacation.

Read the full report: Computerworld IT Salary Survey 2014

TEKsystems reported similar findings in its “Stress & Pride” survey issued last May. According to the IT services and staffing firm, 41% of those polled said they were expected to be available 24/7 while 38% said they had to be accessible only during the traditional work hours of 8 a.m. to 6 p.m. The remaining 21% fell somewhere in between.

“Being on all the time is the new normal,” says Jason Hayman, market research manager at TEKsystems. “[Bring-your-own-device] trends and flexible work arrangements have obliterated the traditional split between work and nonwork time, and IT gets hit hard.”
The reality of staying relevant

Around-the-clock accessibility is not only part of the IT job description today, it’s the reality of staying relevant in a climate where so many IT roles are outsourced overseas, according to Meadows. “Work can be done much cheaper in India, Russia or China,” he says. “So you need to be able to get things done as fast as stuff happens in other places, and many more work hours are required to make that happen. When you sign up for this job, that’s just the way it is.”
Checking in

[Survey chart: How frequently, on average, do you check messages or communicate with your office during nonscheduled work hours such as evenings, weekends, holidays or vacation?]

Being available may be part of the job, but demands can become onerous, notes Robert Sample, formerly a senior technical analyst with Cox Media Group. “When I started in the 1998 to 1999 time frame, a person would be on call for a week, and typically you might get one or two contacts during off hours,” says Sample, who is currently between jobs. “Over the last few years, the change has been toward immediate responsiveness and more active involvement.”

At Cox Media, Sample was issued a BlackBerry that pinged him with an email alert when a trouble ticket was started. “Our SLA [service-level agreement] specified a response within four hours no matter what,” he says. “That goal didn’t even consider whether it was [during] work hours.”

Many IT professionals say they’ve made a routine of frequent check-ins. It helps avert problems and makes the workday smoother, they say, since there often isn’t enough time during traditional hours to get everything done. That’s partly what motivates Merlyn Reeves to make herself available around the clock.

A project manager for a network communications provider, Reeves works from home. She says the need to coordinate with colleagues in different time zones means she might have to chair a conference call at 7 a.m. or respond to emails while watching 60 Minutes on a Sunday night. She keeps her cellphone bedside so she can respond to the occasional email at night, and she works on Sundays to get a jump-start on the week.

Reeves says she doesn’t do that because her managers expect it; rather, it’s her personal work ethic that drives her. “It’s not spoken that it’s expected, and if I didn’t respond at 8 p.m. on Sunday night, no one would chastise me,” she says. “But as a project manager, I don’t ever want to be the holdup to getting something done.”
Making 24/7 work

Work ethic aside, Reeves and other IT professionals have developed strategies for managing the “always-on” requirement in the hopes of creating a modicum of work/life balance. Reeves won’t wade in on certain email discussions during off-hours, and she’s learned to take vacation during Christmas week, when many people aren’t working, so she can unplug without the stress.

Sample has also changed the way he vacations. “I’ve started taking a cruise every year,” he says. “You get a few miles offshore, and cellphones don’t work. That way, you can take a vacation and not have to worry about problems until you get back.”

Kathy McFarland, quality assurance specialist at Vanderbilt University Medical Center, makes it very clear in her voicemail message and email signature if she’s out of the office and when and how she will respond. And like Reeves, she has gotten strategic about the emails she will and won’t answer during off-hours.

“You have to try to stop the insanity somehow,” she says. “If it’s a focused question that I can answer quickly, I will respond, and that’s OK. When it’s a flurry because there are multiple people on a thread and everyone gets whipped up, I refuse to respond.”
Long hours

[Survey chart: How many hours per week do you work on average?]

Even with those coping strategies, she admits it’s hard to unplug. “You try to turn off when you can, but if the executive steering committee wants answers, they want them when they want them,” McFarland says. “They don’t care if it’s 5 p.m. on a Friday.”

Still, there are ways to draw the line, notes Allan Harris, a cloud architect at Partners HealthCare. While Harris regularly makes himself available during off-hours, he proactively makes sure people know how and where to seek help when he’s out of the office on planned time off with his family. More often than not, people respect his time, but there are the occasional situations where someone tracks him down on his cellphone.

“If I have an out-of-office message that specifies that someone else should be contacted, and someone calls me directly, I have a problem with that,” he says. The first thing he does is triage the problem, but he also sets boundaries. “The problem is most important, but I do let the customer know that we’ll address the situation when I come back to the office, where we’ll talk about SLAs and the proper escalation procedures,” he explains.

The embrace of the bring-your-own-device trend among IT pros definitely contributes to the increase in calls during off-hours, says Harris. “When you give out your personal cell number, it’s kind of like a Batphone — people think they can get a personal response.”
Taking the good with the bad

Despite the inconveniences, IT professionals say there is an upside to the 24/7 mentality. Because people are actively working at night, in the early mornings or on weekends, there is greater flexibility to step out during the workday to run errands or spend time with the kids, especially if you can work from home.

That’s how Scott Murray, business intelligence manager at Hospital Corporation of America (HCA), sees it. Murray, who has worked from home for six years, says he regularly emails or instant-messages with colleagues late at night or in the early morning hours, and he works some weekends to create reports tied to the monthly accounting cycle.

On the flip side, Murray coaches high school soccer and is out for practice from 3:45 to 5:30 p.m. every day during the season. “I feel like that’s OK because I’m available on weekends and after work,” he says. “If I were sitting in an office, there would be an expectation that I’d be there until 5 p.m. or later, and I couldn’t do the coaching.” Additionally, Murray doesn’t go totally dark. “I still answer the phone at soccer practice,” he says. “If something goes wrong, my boss knows he can reach me.”

Establishing trust and respect helps make the “always-on” culture work for both IT employees and management, says Cynthia Hamburger, CIO/COO at Learning Ally, a nonprofit dedicated to helping people with learning disabilities. Hamburger, who has been a CIO at larger companies, including Dun & Bradstreet, says it’s important to protect people’s personal time and publicly acknowledge them when they go beyond the call of duty. But respecting personal time doesn’t necessarily mean that weekends are off-limits.

“If you are on vacation with the family, unless the house is burning down, we will not contact you,” she says. But for those who aren’t taking paid time off, “there is an ‘always available’ mentality. It goes with an IT role and, unfortunately, the digitalization of the planet has made it worse,” she adds. “There is an expectation that most forms of contact are checked pretty regularly.”

While Hamburger says technology has made it easier for IT professionals to stay connected, she says the idea of 24/7 access is really nothing new, particularly among those interested in advancement. “People who have been the most successful in IT have had this work ethic all along,” she says. “The technology has just made us much more accessible in real time.”



New CEO Satya Nadella comes out swinging on ‘cloud first, mobile first’ strategy

As expected, Microsoft CEO Satya Nadella today hosted a press conference where the company unveiled Office for iPad, breaking with its past practice of protecting Windows by first launching software on its own operating system.

CEO Satya Nadella expounded on Microsoft’s ‘cloud first, mobile first’ strategy today as his company unveiled Office for iPad as proof of its new platform-agnosticism.

Three all-touch core apps — Word, Excel and PowerPoint — have been seeded to Apple’s App Store and are available now.

The sales model for the new apps is different than past Microsoft efforts. The Office apps can be used by anyone free of charge to view documents and present slideshows. But to create new content or documents, or edit existing ones, customers must have an active subscription to Office 365.

Microsoft labeled it a “freemium” business model, the term used for free apps that generate revenue by in-app purchases.

Today’s announcement put an end to years of speculation about whether, and if so when, the company would trash its strategy of linking the suite with Windows in an effort to bolster the latter’s chances on tablets. It also reversed the path that ex-CEO Steve Ballmer laid out last October, when for the first time he acknowledged an edition for the iPad but said it would appear only after a true touch-enabled version had launched for Windows tablets.

It also marked the first time in memory that Microsoft dealt a major product to an OS rival of its own Windows.

“Microsoft is giving users what they want,” Carolina Milanesi, strategic insight director of Kantar Worldpanel ComTech, said in an interview, referring to long-made customer demands that they be able to run Office on any of the devices they owned, even those running a Windows rival OS. “The connection to Office 365 was also interesting in that this puts users within Microsoft’s ecosystem at some point.”

Prior to today, Microsoft had released minimalist editions of Office, dubbed “Office Mobile,” for the iPhone and Android smartphones in June and July 2013, respectively. Originally, the iPhone and Android Office Mobile apps required an Office 365 subscription; as of today, they were turned into free apps for home use, although an Office 365 plan is still needed for commercial use.

Talk of Office on the iPad first heated up in December 2011, when the now-defunct The Daily reported Microsoft was working on the suite, and added that the software would be priced at $10 per app. Two months later, the same publication claimed it had seen a prototype and that Office was only weeks from release.

That talk continued, on and off, for more than two years, but Microsoft stuck to its Windows-first strategy. Analysts who dissected Microsoft’s moves believed that the company refused to support the iPad in the hope that Office would jumpstart sales of Windows-powered tablets.

Office’s tie with Windows had been fiercely debated inside Microsoft, but until today, operating system-first advocates had won out. But slowing sales of Windows PCs — last year, the personal computer industry contracted by about 10% — and the continued struggles gaining meaningful ground in tablets pointed out the folly of that strategy, outsiders argued.

Some went so far as to call Windows-first a flop.

Microsoft has long hewed to that strategy: The desktop version of Office has always debuted on Windows, for example, with a refresh for Apple’s OS X arriving months or even more than a year later.

Microsoft today added free Word, Excel and PowerPoint apps for the iPad to the existing OneNote.

On his first day on the job, however, Nadella hinted at change when he said Microsoft’s mission was to be “cloud first, mobile first,” a signal, said analysts, that he understood the importance of pushing the company’s software and services onto as many platforms as possible.

Nadella elaborated on that today, saying that the “cloud first, mobile first” strategy will “drive everything we talk about today, and going forward. We will empower people to be productive and do more on all their devices. We will provide the applications and services that empower every user — that’s Job One.”

Like Office Mobile on iOS and Android, Office for iPad was tied to Microsoft’s software-by-subscription Office 365.

Although the new Word, Excel and PowerPoint apps can be used free of charge to view documents and spreadsheets, and present PowerPoint slideshows, they allow document creation and editing only if the user has an active Office 365 subscription. Those subscriptions range from the consumer-grade $70-per-year Office 365 Personal to a blizzard of business plans starting at $150 per user per year and climbing to $264 per user per year.

Patrick Moorhead, principal analyst at Moor Insights & Strategy, applauded the licensing model. “It’s very simple. Unlike pages of requirements that I’m used to seeing from Microsoft to use their products, if you have Office 365, you can use Office for iPad. That’s it,” Moorhead said.

He also thought that the freemium approach to Office for iPad is the right move. “They’ve just pretty much guaranteed that if you’re presenting on an iPad you will be using their apps,” said Moorhead of PowerPoint.

Moorhead cited the fidelity claims made by Julie White, a general manager for the Office technical marketing team, who spent about half the event’s time demonstrating Office for iPad and other software, as another huge advantage for Microsoft. “They’re saying 100% document compatibility [with Office on other platforms], so you won’t have to convert a presentation to a PDF,” Moorhead added.

Document fidelity issues have plagued Office competitors for decades, and even the best of today’s alternatives cannot always display the exact formatting of an Office-generated document, spreadsheet or presentation.

Both Milanesi and Moorhead were also impressed by the strategy that Nadella outlined, which went beyond the immediate launch of Office for iPad.

“I think [Satya Nadella] did a great job today,” said Milanesi. “For the first time I actually see a strategy [emphasis in original].

“Clearly there’s more to come,” Milanesi said. “It was almost as if Office on iPad was not really that important, but they just wanted to get [its release] out of the way so they could show that there’s more they bring to the plate.”

That “more” Milanesi referred to included talk by Nadella and White of new enterprise-grade, multiple-device management software, the Microsoft Enterprise Mobility Suite (EMS).

“With the management suite and Office 365 and single sign-on for developers, Microsoft is really doing something that others cannot do,” Milanesi said. “They made it clear that Microsoft wants to be [enterprises'] key partner going forward.”

Moorhead strongly agreed. “The extension of the devices and services strategy to pull together these disparate technologies, including mobile, managing those devices, authenticating users for services, is something Microsoft can win with. It’s a good strategy,” Moorhead said.

“This was the proof point of delivering on the devices and services strategy,” Moorhead concluded. “And that strategy is definitely paying off.”

Office for iPad can be downloaded from Apple’s App Store. The three apps range in size from 215MB (for PowerPoint) to 259MB (for Word), and require iOS 7 or later.


There are ways around it, but upgrading may be simpler, cheaper

When Microsoft stops supporting Windows XP next month, businesses that have to comply with payment card industry (PCI) data security standards, as well as health care and financial standards, may find themselves out of compliance unless they call in some creative fixes, experts say.

Strictly interpreted, the PCI Security Standards Council requires that all software have the latest vendor-supplied security patches installed, so when Microsoft stops issuing security patches April 8, businesses processing credit cards on machines using XP should fall out of PCI compliance, says Dan Collins, president of 360advanced, which performs security audits for businesses.

But that black-and-white interpretation is tempered by provisions that allow for compensating controls – supplementary procedures and technology that help make up for whatever vulnerabilities an unsupported operating system introduces, he says.

These can include monthly or quarterly reviews of overall security, use of software to monitor file integrity and rebooting each XP machine every day in order to restore it to a known safe state, says Mark Akins, CEO of 1st Secure IT, which also performs compliance audits. That safe state can be reset using a Microsoft tool called SteadyState that was built for XP but not later versions of Windows.
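
In principle, the file integrity monitoring Akins mentions comes down to hashing key files and comparing the hashes against a known-good manifest. Below is a bare-bones Python sketch of that idea; the watched paths and manifest filename are hypothetical, and commercial FIM products add scheduling, tamper-resistant storage and alerting on top of this.

import hashlib, json, os

WATCHED = [r"C:\POS\app.exe", r"C:\POS\config.ini"]   # hypothetical files on an XP point-of-sale machine
MANIFEST = "fim_manifest.json"                        # known-good hashes recorded on first run

def sha256(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

current = {p: sha256(p) for p in WATCHED if os.path.exists(p)}

if not os.path.exists(MANIFEST):
    with open(MANIFEST, "w") as f:
        json.dump(current, f)                         # first run: record the known-good state
    print("Baseline manifest recorded.")
else:
    with open(MANIFEST) as f:
        baseline = json.load(f)
    for path, digest in current.items():
        if baseline.get(path) != digest:
            print(f"ALERT: {path} changed since the known-good baseline")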

“Risk is the factor,” he says, and mitigating it is the goal, but the mitigations must reduce risk just as effectively as the original regulatory requirement that is not being met. To some extent that is a subjective call, and depending on the auditor businesses may have more or less flexibility in what compensating controls are deemed OK, says Akins.

Health Insurance Portability and Accountability Act (HIPAA) and Sarbanes-Oxley (SOX) financial regulations have provisions similar to those in the PCI standard, says Collins. In fact, PCI provisions are pretty much the baseline for the other two, which have some additional requirements tacked on, he says. So the issue goes well beyond businesses that handle credit cards.

These workarounds may sound good to businesses that haven’t upgraded to Windows 7 or 8/8.1 yet, Akins says, but it’s not likely to save any time, effort or money. “For IT it’s easier to upgrade to Windows 7 or 8 versus implementing file integrity monitoring and installing SteadyState,” he says.

Compensating controls can place a big load on IT departments because tasks such as updating antivirus software daily and constantly monitoring file integrity or watching for evidence of intrusions aren’t simple, Collins says. “It’s an arduous task,” he says.

“Compensating controls should be as short-term as possible,” and used only in order to keep key business applications running. Some legacy or proprietary business-critical software runs best or only runs on Windows XP, he says, and there are no feasible alternatives yet. “It’s a major issue if the software deployed is unstable on newer versions of Windows.”

That situation leaves businesses with a choice. The first option is to stay on Windows XP and implement compensating controls. The second is to buy replacement apps, or rewrite old ones so they perform well on Windows 7 or 8/8.1, and then migrate. A third option is to pay Microsoft for extended XP support – also costly, but something that can buy time until a better solution is in place.

Some merchants that should comply with PCI could fly under the radar for a while without doing anything to address Windows XP non-compliance, he says. While it’s not advisable, they are not compelled to have security audits unless a merchant bank or credit processing service provider requires it – and that doesn’t happen all the time, Collins says.

PCI doesn’t require all businesses to meet the updated operating system requirement. If credit card data is collected by a business, encrypted using keys that are not in control of that business and passed off to a separate entity for processing and storage, the collecting business doesn’t have to comply with the requirement for a fully patched and supported operating system, Akins says.

Still, the best option is to upgrade, Collins says. “It’s difficult to envision a case where the cost of upgrading is greater than the cost of compensating controls,” he says.


Another executive shakeup at Microsoft

Written by admin
March 4th, 2014

Rumor: Biz development head Bates, marketing chief Reller call it quits

Just a month after Satya Nadella took over as Microsoft CEO, the executive inner circle is being overhauled, with two key leaders leaving the company and a third assuming significant new power.

Executive vice presidents Tony Bates, the former CEO of Skype, and Tami Reller, who cut her teeth on Windows, are leaving the company, according to a post by Kara Swisher on re/code.

Bates had reportedly been a top contender for CEO and was serving as head of business development and evangelism. Reller was head of marketing.

Bates’ job will be filled temporarily by Executive Vice President Eric Rudder, who is in charge of advanced strategy, according to the report.

Reller’s job is being expanded and filled by Chris Capossela, a Microsoft marketing executive who will now be executive vice president of both marketing and advertising, the report says.

Both Bates and Reller were in ambiguous jobs under a reorganization put in place last year by outgoing CEO Steve Ballmer.

Reller was named executive vice president of marketing under that new management scheme, but she essentially had to share the job with Mark Penn, another executive vice president, who “will take a broad view of marketing strategy and will lead with Tami the newly centralized advertising and media functions.”

Similarly, Bates had uncertain duties and power in dealing with manufacturing partners. Under the Ballmer reorganization, “OEM will remain in [the sales marketing and services group] with Kevin Turner with a dotted line to Tony who will work closely with Nick Parker on key OEM relationships.” At best he had fragmented authority.

Bates came onboard at Microsoft when the company bought Skype for $8.5 billion in 2011. Reller was brought into Microsoft when it bought Great Plains Software in 2001. Earlier she was both the chief financial officer and the chief marketing officer for Microsoft’s Windows division, which was moved into the operating systems division under Ballmer’s reorganization. She assumed her role as executive vice president when Ballmer reorganized.

News of this latest shakeup comes just a week after Nadella cleared room at the top for Stephen Elop, the former CEO of Nokia who is joining the company as an executive vice president in charge of devices and studios when Microsoft’s purchase of Nokia is finalized.

That means the current occupant of the slot, Julie Larson-Green, will move over and down to the newly created position of chief experience officer (CXO), in which she will report to another executive vice president, Qi Lu, who is in charge of applications (Office, SharePoint, Yammer, Lync, Skype) and services (Bing and MSN).

According to an email Larson-Green sent to her staff and published by Mary Jo Foley in her All About Microsoft blog, Elop is scheduled to step into his new role as soon as Microsoft’s purchase of Nokia’s phone business is complete. Meanwhile, Larson-Green will continue in her current role.


7 Technology Job Boards to Find or Fill Positions

Written by admin
February 27th, 2014

Job boards are an important element of a well-rounded job search for workers or employee search for recruiters and hiring managers, but with so many options, where should you start? This guide is a good place. We look at job boards that focus squarely on technology and IT jobs.

Technology Job Boards
Job hunting is hard work. It’s either your full-time job or you’re finding time after working a full day and on weekends. It’s important to prioritize and use a multi-pronged attack. Job boards are one part of that equation.

For employers, good IT professionals are hard to find, and specialized ones are even harder to find. Niche IT and technology job boards can help you find skilled talent in less time. So whether you’re looking for a new job or a new employee, niche technology job boards can help you find top talent and job offers that you may not find in other places.

CIO.com’s IT Jobs
If you’re a CIO or senior IT executive in the market for your next gig, CIO.com’s IT jobs board is a good place to start your search. (Before you start thinking about bias, the slides are in alphabetical order.)

This job board specializes in tech jobs at the highest levels. At the time of this article, there were more than 1,000 listings for IT pros and executives.
Listing Cost:
$295 for 60 days

Crunch Board
IT job seekers would do well to check out TechCrunch’s CrunchBoard. This niche site offers technology-related job listings as well as editorial from the TechCrunch Network.
Listing Cost:
$200 – One Job Posting (30 days)
$895 – 5 Pack of Job Postings
$1495 – 10 Pack of Job Postings

Dice
Dice is one of the best-known tech-centric job boards around. Communities within Dice are specialized by skill or interest, which can make it easy for an employer to find specialized job candidates. It has a variety of listings, from straight-up job posts to full-service recruiting packages.
Listing Cost:
Job Posting Express option starts at $395 for 30 days

iCrunchData
If analytics and big data are where your skills lie, then iCrunchData may be just what you’re looking for. Here you’ll find tech-centric jobs that focus specifically on big data and analytics, as well as tech jobs in general.
Listing Cost:
iCrunchData offers job posting credits starting at $375 for one credit, with a price break for each additional credit. The credits don’t expire, and if those prices don’t work, you can bid your own. Aside from that, it offers unlimited job packages for hiring managers starting at $595.

ITJobPro
IT Job Pro is a portal that, as the title implies, delivers technology job listings to job seekers. It offers jobs in the U.S., Europe, Asia, Australia and New Zealand.
Listing Cost:
$120 per job

Venture Beat
VentureBeat’s job board offers another place for IT job seekers to look for opportunities. It has a small database of jobs that are mainly IT or technology-based.
Listing Cost:
$99 for 30 Days

We Work Remotely
As the name implies, We Work Remotely focuses on jobs that allow employees to telecommute. While not technically a technology board, the telecommuting angle lends itself well to the IT job market. A simple glance at the home page shows listings dominated by IT and development positions.
Listing Cost:
$200 for 30 Days

This list is by no means exhaustive. There are many other job boards out there (e.g., LinkedIn, Indeed, CareerBuilder or Monster) that cover a wide spectrum of jobs. Which sites did you use to find your last IT job?


Gigabit Internet Service Providers Challenge Traditional ISPs

Last fall, the New America Foundation’s Open Technology Institute published a study examining high-speed Internet prices around the world. Compared to its international peers, the bulk of the United States pays higher prices for slower service.

Internet access itself is only part of that cost. Comcast and AT&T, for example, charge monthly fees for modems, routers and additional wiring (along with cable boxes, remotes and recording equipment for cable customers), which almost doubles the advertised monthly cost. Internet service providers can do that because, apart from a handful of spots in America, the competition is severely limited, if not nonexistent.

That could change, experts say – and Google’s high-speed, low cost, gigabit Internet service deserves the credit.

Google Fiber By the Numbers: Better, Faster, Cheaper
Google Fiber costs $70 per month, or $120 per month for an Internet/TV bundle, with installation fees of up to $30. Google says it’s 100 times faster than the average cable Internet connection. Compared to what other services list on their websites, Google Fiber is 22 times faster than AT&T’s best offering, 10 times faster than Comcast’s and 3.3 times faster than Verizon’s top choice – and it costs 24 times less than AT&T, 15 times less than Comcast and 10 times less than Verizon.

Google Fiber is even cheaper than ISPs’ slowest connection options. Comcast’s slowest plan, at 6 Mbps, costs $8.33 per megabit, while AT&T’s comparable package is $7.67 per megabit. Google’s gigabit plan is 167 times faster but costs only $0.07 per megabit.
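
Those per-megabit figures are easy to reproduce. Here is a quick Python sketch using only the prices and speeds cited above, treating 1 Gbps as 1,000 Mbps:

# Cost per megabit of advertised throughput, from the figures cited in this article.
def cost_per_mbps(monthly_price_usd, speed_mbps):
    return monthly_price_usd / speed_mbps

google_fiber = cost_per_mbps(70.00, 1000)   # $70/month gigabit plan -> $0.07 per megabit
comcast_slowest = 8.33                      # per-megabit price of Comcast's 6 Mbps plan, as cited above
att_comparable = 7.67                       # per-megabit price of AT&T's comparable package, as cited above

print(f"Google Fiber: ${google_fiber:.2f} per megabit")                                     # $0.07
print(f"Speed advantage over a 6 Mbps plan: {1000 / 6:.0f}x")                               # roughly 167x
print(f"Per-megabit premium, Comcast slowest plan: {comcast_slowest / google_fiber:.0f}x")  # about 119x
print(f"Per-megabit premium, AT&T comparable plan: {att_comparable / google_fiber:.0f}x")   # about 110x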

What’s more, Comcast Executive Vice President David L. Cohen has argued that Americans don’t need high-speed Internet because they can’t handle it. (Time Warner agrees.) According to Cohen, even if Comcast could deliver gigabit service like Google Fiber’s 1 Gbps, most customers couldn’t access those speeds because of insufficient equipment. (He neglected to mention that Google provides its high-speed compatible equipment to all customers at no additional cost, along with a free Nexus 7 tablet, unlike the additional monthly fees Comcast charges for its equipment.)

Forrester communications and networking analyst Dan Bieler says Google Fiber increases Google’s leverage in negotiations with carriers over connectivity provisioning. Clearly, the carriers and cable providers want to retain a major role in connectivity provisioning. If Google builds its own networks to homes and business users, carriers risk losing customers to Google.

“Google Fiber has forced the competition to take a closer look at the need to roll out ‘real’ broadband at a reasonable price,” Bieler says. This will happen in areas with “high purchasing power and a high business density,” but it’s less likely in rural areas, where fiber investments aren’t always as easy to justify. “Competition for fiber will increase,” Bieler says, “but not everywhere.”

Ian Keene, research analyst and vice president at Gartner, agrees: “High bandwidths of 100 Mbps and above will only be available in the large cities for the foreseeable future.”

Telecommunications firms and cable multiple-system operators (MSOs) are competing to get fiber closer to subscribers, Keene says. Telcos have mixed feelings about fiber into the home. Some bring fiber closer and use existing copper to provide broadband, since the emerging G.fast copper standard can deliver 500 Mbps services. Others swallow the capital expenses needed to install new cables and equipment in the home. Finally, along with competition, government broadband initiatives are driving improved services, he says.

Gigabit Internet Arriving, Slowly But Surely
Google Fiber – installed throughout Kansas City, Kan. and Kansas City, Mo., with Austin, Texas and Provo, Utah next on the list – isn’t the only gigabit Internet provider in the United States. The ranks vary, too, from ISPs to electric companies to municipal governments, all offering services for a fraction of the cost of cable. This suggests that competition is coming from all corners.

Chattanooga, Tenn., can thank its electric company, EPB, for the gigabit service available across its nine-county service area. EPB needed a network that could monitor and communicate with new digital grid equipment – but the nation’s biggest phone and cable companies said they couldn’t deliver one for another decade or more. So EPB became the sole ISP for Chattanooga, also referred to as Gig City, and now manages 8,000 miles of fiber for 56,000 commercial and residential Internet customers. The service costs about $70 a month (compared to $300 a month before EPB stepped in).

In addition, the Vermont Telephone Co. has brought gigabit Internet to Burlington, the state’s largest city, and Springfield, the town where it’s headquartered. CTO Justin M. Robinson says “it’s certainly not without concern” being among a handful of companies providing gigabit Internet, “but we like to think what we are doing on a small scale here in Vermont could be replicated in a thousand different places across the country or, perhaps, even expanded to become a nationwide goal.”

Vermont Telephone’s gigabit Internet rollout is part of a larger project, funded in part by the federal Broadband Initiatives Program, that’s also upgrading the state’s voice telephone switch, adding an IPTV video head-end and deploying a 4G/LTE wireless network to most of the state, Robinson says.

According to Robinson, the goal is to build fiber to all 17,500 Vermont Telephone customers. Approximately 3,500 homes and businesses have been converted so far, with broadband penetration for those converted exceeding 80 percent. The IPTV video service, built on the former Microsoft Mediaroom platform, which Ericsson recently acquired, is in a trial phase.

One of the most compelling reasons for the gigabit Internet rollout, Robinson says, was the realization that significantly higher throughput has only a minor effect on total usage but still improves customers’ experience.

“They can access data more quickly and perform multiple tasks at once,” Robinson says. “My wife can watch a movie on Netflix and browse Reddit while, at the same time, I remotely connect to the office … listen to streaming music from Pandora and download the latest [game] from Steam in the background.

“At GigE speeds,” Robinson continues, “the worry about bandwidth disappears. The bandwidth is always available and waiting for the customer, instead of the customer waiting for the bandwidth.”


Microsoft Lync to play nice with Cisco, Android

Written by admin
February 19th, 2014

A number of common IT projects that seem like they should add value rarely do. Here are what I consider the top IT projects that waste budget dollars.

The role of the CIO has changed more in the past five years than any other position in the business world. Success for the CIO used to be based on bits and bytes; it is now measured by business metrics. Today’s CIO needs to think of IT more strategically and focus on projects that lower cost, improve productivity or, ideally, both.

However, many IT projects seem to be a waste of time and money. It’s certainly not intentional, but a number of projects that seem like they should add value rarely do. Here are what I consider the top IT projects that waste budget dollars.

Over provisioning or adding more bandwidth
Managing the performance of applications that are highly network-dependent has always been a challenge. If applications are performing poorly, the easy thing to do is just add more bandwidth. Seems logical. However, bandwidth is rarely actually the problem, and the net result is usually a more expensive network with the same performance problems. Instead of adding bandwidth, network managers should analyze the traffic and optimize the network for the bandwidth-intensive applications.
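
As a rough illustration of analyzing the traffic before buying bandwidth, the Python sketch below ranks applications by their share of total bytes from exported flow records. The flows.csv file and its column names are hypothetical; in practice this data would come from a NetFlow/sFlow collector or a monitoring appliance.

import csv
from collections import defaultdict

# Hypothetical flow export: one row per flow, with an application label and a byte count.
bytes_by_app = defaultdict(int)
total = 0
with open("flows.csv", newline="") as f:              # assumed columns: application, bytes
    for row in csv.DictReader(f):
        b = int(row["bytes"])
        bytes_by_app[row["application"]] += b
        total += b

if total == 0:
    raise SystemExit("No flow records found.")

# The handful of applications at the top of this list are the optimization targets --
# shaping, caching or rescheduling them is usually cheaper than a blanket bandwidth upgrade.
for app, b in sorted(bytes_by_app.items(), key=lambda kv: kv[1], reverse=True)[:10]:
    print(f"{app:20s} {b / total:6.1%} of traffic")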

Investing in fault management tools
On paper, it makes sense to invest in fault management. You deploy network devices, servers, security products, and other infrastructure, so of course you would want to know when devices are up and down. However, the fact is that today we build our networks with so much redundancy that the loss of any single device has little impact on the performance of applications. Also, most fault management tools have a big blind spot when it comes to virtual resources, as the tools were designed to monitor physical infrastructure. IT organizations should focus on performance solutions that can isolate what’s been “too wrong for too long” to solve those nagging “brownouts” that cause user frustration.

Focusing IT energy only on the “top talkers”
When I talk to IT leaders about new initiatives, it seems much of the focus is on the top 5 or 10 applications, which makes some sense conceptually, as these are the apps that the majority of workers use. Instead, IT leaders should monitor all applications and correlate usage to business outcomes to determine and refine best practices. For example, a successful branch office could be a heavy user of LinkedIn, Salesforce.com and Twitter. In aggregate, these might not be among the company’s top 10 applications, and the usage would fly under the radar. If organizations could monitor applications and then link consistent success to specific usage patterns, unknown best practices could be discovered and mapped across the entire user population.

Using mean time to repair (MTTR) to measure IT resolution success
ZK Research studies have revealed a few interesting data points when it comes to solving issues. First, 75% of problems are actually identified by the end user instead of the IT department. Also, 90% of the time taken to solve problems is actually spent identifying where the problem is. This is one of the reasons I’m a big fan of tools that can separate application and network visibility to laser in on where exactly a problem is. This minimizes “resolution ping pong,” where trouble tickets are bounced around IT groups, and enables IT to start fixing the problem faster. If you want to cut the MTTR, focus on identification instead of repair, as that will provide the best bang for the buck.
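
The “identifying where the problem is” step can start with a very crude split between the network path and the application itself. Here is a minimal Python sketch along those lines; the hostname and health-check URL are hypothetical, and real visibility tools do this continuously and far more granularly.

import socket, time, urllib.request

HOST, PORT = "app.example.internal", 443                    # hypothetical application front end
HEALTH_URL = "https://app.example.internal/health"          # hypothetical health-check endpoint

def tcp_connect_ms(host, port, timeout=3):
    # Time a bare TCP connect -- a rough proxy for network-path health.
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=timeout):
        return (time.monotonic() - start) * 1000

try:
    net_ms = tcp_connect_ms(HOST, PORT)
except OSError:
    print("TCP connect failed -- route the ticket to the network team")
else:
    start = time.monotonic()
    urllib.request.urlopen(HEALTH_URL, timeout=5)            # time the application's own response
    app_ms = (time.monotonic() - start) * 1000
    print(f"TCP connect {net_ms:.0f} ms, application response {app_ms:.0f} ms")
    if app_ms > 10 * net_ms:
        print("Network path looks healthy -- the slowness is above the transport, so route to the app team")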

Managing capacity reactively
Most organizations increase the capacity of servers, storage or the network in a reactive mode. Don’t get me wrong, I know most companies try to be proactive. However, without granular visibility, “proactive” often refers to reacting to the first sign of problems, but that’s often too late. Instead, IT departments should understand how to establish baselines and monitor how applications deviate from the norm to predict when a problem is going to occur. For example, a baseline could be established to understand the “normal” performance of a business application. Over four successive months, the trend could be a slight degradation of the application’s performance month after month. No users are complaining yet, but the trend is clear, and if nothing is done, there will be user problems. Based on this, IT can make appropriate changes to the infrastructure to ensure users aren’t impacted.
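
A minimal Python sketch of that baseline-and-deviation idea: record the application’s normal monthly response time, then flag a sustained month-over-month slide before users start complaining. The sample numbers and the five percent threshold are illustrative, not taken from the article.

# Monthly average response times (seconds) for one business application -- sample data.
monthly_avg_response = [1.20, 1.22, 1.27, 1.33, 1.41]

baseline = monthly_avg_response[0]                 # the "normal" performance established up front

# Flag when performance has degraded for three consecutive months
# and the latest month is more than 5% worse than the baseline.
recent = monthly_avg_response[-4:]
steadily_worse = all(later > earlier for earlier, later in zip(recent, recent[1:]))
exceeds_threshold = monthly_avg_response[-1] > baseline * 1.05

if steadily_worse and exceeds_threshold:
    print("Proactive alert: sustained degradation versus baseline -- investigate before users notice.")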

The IT environment continues to get more complex as we make things more virtual, cloud-driven or mobile. It’s time for IT to rethink the way it operates and leverage the network to provide the necessary visibility to stop wasting money on the things that don’t matter and start focusing on issues that do.


So Long IT Specialist, Hello Full-Stack Engineer

Written by admin
February 17th, 2014

At GE Capital, the business is focused not simply on providing financial services to mid-market companies but also on selling the company’s industrial expertise. They might help franchisees figure out how to reduce power consumption or aid aircraft companies with their operational problems. “It’s our big differentiator,” says GE Capital CTO Eric Reed. “It makes us sticky.”

And within IT, Reed is looking not for the best Java programmer in the world or an ace C# developer. He wants IT professionals who know about the network and DevOps, business logic and user experience, coding and APIs.

IT Specialist Out, Full-Stack Engineer In
It’s a shift for the IT group prompted by an exponential increase in the pace of business and technology change. “The market is changing so much faster than it was just two or five or, certainly, 10 years ago,” says Reed. “That changes the way we think about delivering solutions to the business and how we invest in the near- and long-term. We have to think about how we move quickly. How we try things and iterate fast.”

But agility is a tall order when supporting a $44.1 billion company with more than 60,000 employees in 65 countries around the world. “There are several markets we play in, and we can’t be big and slow,” says Reed. “But the question is how do we make ourselves agile as a company our size.”

Like many traditional IT organizations, GE Capital had one group that developed and managed applications and another that designed and managed infrastructure. Over time, both groups had done a great deal of outsourcing. It wasn’t an organizational structure designed for speed.

An engineer by training, Reed saw an opportunity to apply the new product introduction (NPI) process developed at GE a couple of decades ago to the world of IT development. Years ago, a GE engineer might split his or her time between supporting a plant, providing customer service, and developing a new product. “With NPI, we turned that on its ear and said you’re going to focus only on this new product,” explains Reed. “You take people with different areas of expertise and you give them one focus.”

That’s what Reed did with IT. “We take folks that might do five different things in the course of the day and focus them on one task — with the added twist being that you can’t be someone who just writes code,” says Reed.

A New Type of IT Team Forms
Last year, Reed pulled together the first such team to develop a mobile fleet management system for GE Capital’s Nordic region. He assembled a diverse group of 20, who had previously specialized in networking, computing, storage, applications, or middleware, to work together virtually. He convinced all of the company’s CIOs to share their employees. They remained in their initial locations with their existing reporting relationships, but for six months all of their other duties were stripped away. “The CIOs had to get their heads around that,” Reed says.

The team was given some quick training in automation and given three tasks: develop the application quickly, figure out how to automate the infrastructure, and figure out how to automate more of the application deployment and testing in order to marry DevOps with continuous application delivery.

There were no rules — or roles. “We threw them together and said, ‘You figure it out,’” Reed recalls. “We found some people knew a lot more than their roles indicated, and the lines began blurring between responsibilities.” Some folks were strong in certain areas and shared their expertise with others. Traditional infrastructure professionals had some middleware and coding understanding. “They didn’t have to be experts in everything, but they had a working knowledge,” Reed says.

The biggest challenge was learning to be comfortable with making mistakes. “GE has built a reputation around execution,” says Reed. “My boss [global CIO of GE Capital] and I had to figure out how to foster an environment where people take risks even though it might not work out.”

Project Success
The project not only proceeded quickly — the application was delivered within several months — it established some new IT processes. The team increased the amount of automation possible not only at the infrastructure level, but within the application layer as well. They also aimed for 60 to 70 percent reusability in developing the application, creating “lego-like” building blocks that can be recycled for future projects.

Business customers welcomed the new approach. In the past, “they would shoehorn as many requirements into the initial spec as possible because they didn’t know when they’d ever have the chance again,” says Reed. “Now it’s a more agile process.” The team launches a minimum viable solution and delivers new features over time.

For IT, “it was a radical change in thinking,” says Reed. “We’ve operated the same way literally for decades. There were moments of sheer terror.” And it wasn’t for everyone. Some opted out of the project and went back to their day jobs.

But Reed is eager to apply the process to future projects and rethink the way some legacy systems are built and managed. “We had talked about services-oriented architecture, and now we have something tangible that shows it can be done,” Reed says. “On the legacy side, we have to decide if we want to automate more of that infrastructure and keep application development the old way or invest in this.”

Some employees remained with the fleet management app team. Others started a new project. And a few went back to their original roles. “We’re trying to make disciples so more people can learn about this process,” Reed says.

Reed can envision the IT organization changing eventually. “What we look for in people when we hire them will change. There were years when we went out in search of very technical people. Then there were years of outsourcing where we sought people who could manage vendors and projects,” Reed says. “Now we need both, and we need to figure out how to keep them incentivized.”


7 Reasons Not to Use Open Source Software

Written by admin
February 13th, 2014

Talk to an open source evangelist and chances are he or she will tell you that software developed using the open source model is the only way to go.

The benefits of open source software are many, varied and, by now, well-known. It’s free to use. You can customize it as much as you want. Having many sets of eyes on the source code means security problems can be spotted quickly. Anyone can fix bugs; you’re not reliant on a vendor. You’re not locked in to proprietary standards. Finally, you’re not left with an orphaned product if the vendor goes out of business or simply decides that the product is no longer profitable.

However, the open-source evangelist probably won’t tell you that, despite all these very real benefits, there are times when using closed-sourced, proprietary software actually makes far more business sense.

Here are some of the circumstances when old-fashioned proprietary products are a better business choice than open source software.

1. When It’s Easier for Unskilled Users
Linux has made a huge impact on the server market, but the same can’t be said for the desktop market – and for good reason. Despite making strides in the last several years, it’s still tricky for the uninitiated to use, and the user interfaces of the various distributions remain far inferior to those of Windows or Mac OS X.

While Linux very well may be technically superior to these proprietary operating systems, its weaknesses mean that most users will find it more difficult and less appealing to work with. That means lower productivity, which will likely cost far more than purchasing a proprietary operating system with which your staff is familiar.

2. When It’s the De Facto Standard
Most knowledge workers are familiar with, and use, Microsoft Word and Excel. Even though there are some excellent open source alternatives to Office, such as LibreOffice and Apache OpenOffice, they aren’t identical in terms of functionality or user interface, performance, plugins and APIs for integration with third-party products. They are probably close enough as much as 90 percent of the time, but on rare occasions there’s a risk that these differences will cause problems – especially when exchanging documents with suppliers or customers.

It also makes sense to use proprietary software in specialist fields where vendors are likely to have gone into universities and trained students on their software. “The software may not necessarily be better, but it may be selected by a university before an open source solution gets a big enough community around it,” says Chris Mattman, an Apache Software Foundation member and a senior computer scientist at the NASA Jet Propulsion Laboratory.

“When that happens, the students will then know the software better and be more productive with it,” Mattman says. When the students then move into a business environment, it makes sense for them to continue with the software they are used to.

3. When Proprietary Software Offers Better Support
Business-class support is sometimes available for open source software, either from the company leading the project or a separate third-party. This isn’t the case often, though – and that can be a problem, according to Tony Wasserman, professor of software management practice at Carnegie Mellon University.

“Some customers prefer to have someone outside the company to call for product support on a 24/7 basis and are willing to pay for a service level agreement that will provide a timely response,” he says. “People often respond very quickly to queries posted on the forum pages of widely-used open source projects, but that’s not the same thing as a guaranteed vendor response in response to a toll-free telephone call.”

4. When You Want Software as a Service
Cloud software is slightly different than conventional software. As a general rule, you don’t get access to the source code, even if the hosted software is built entirely on open source software. That may not make the software proprietary, strictly speaking, but it doesn’t give you all the benefits of open source. In that sense, the benefits of using the “pay for what you use” software as a service model may outweigh the disadvantage of not having access to the source code.

5. When Proprietary Software Works Better With Your Hardware
Many types of proprietary hardware require specialized drivers; these are often closed source and available only from the equipment manufacturer. Even when an open source driver exists, it may not be the best choice. “Open source developers may not be able to ‘see’ the hardware, so the proprietary driver may well work better,” Mattman says.

6. When Warranties and Liability Indemnity Matter
Some open source software companies, such as Red Hat, are structured to look like proprietary software vendors. They accordingly offer warranties and liability indemnity for their products, just like proprietary vendors do. “These companies are exactly the same as proprietary software companies, except that they won’t take you out to play golf,” Wasserman says.

For every Red Hat, though, there are many open source projects that aren’t backed by a commercial organization. While you may get warranties and liability from a third-party, in many cases you won’t. If that doesn’t suit you or your company’s software procurement policies, then you’re advised to find a proprietary vendor.

7. When You Need a Vendor That Will Stick Around
Yes, there’s no guarantee that a commercial software vendor will stick with a product if demand drops to such an extent that it’s no longer profitable to develop it. The company itself may even go out of business. But if an open source project is small, there’s also a danger that the person behind it may lose interest. If that happens, it may not be easy to find another open source developer to step in.

(This may be more of an argument against small open source projects than an argument for proprietary software – but at least you can look into the books of large software companies and make an informed decision as to whether they’re likely to be around in a few years to honor any commitments they give you.)

Don’t Be Too Dogmatic About Open Source Software

The lesson here: While open source software may often – and even usually – be a better choice than functionally similar proprietary offerings, it doesn’t make sense to be too dogmatic about it.

“As a practical matter, I think that many people would prefer to have everything open, especially in light of the recent revelation about the NSA spying on machines through USB chips,” Wasserman says. At the same time, though, many of those who prefer open source will make exceptions when there are no practical alternatives – not to mention their use of Mac and iOS devices …



 

 

IT inferno: The nine circles of IT hell

Written by admin
February 8th, 2014

The tech inferno is not buried deep within the earth — it’s just down the hall. Let’s take a tour

Spend enough time in the tech industry, and you’ll eventually find yourself in IT hell — one not unlike the underworld described by Dante in his “Divine Comedy.”

But here, in the data centers, conference rooms, and cubicles, the IT version of this inferno is no allegory. It is a very real test of every IT pro’s sanity and soul.


How many of us have been abandoned by our vendors to IT limbo, only to find ourselves falling victim to app dev anger when in-house developers are asked to pick up the slack? How often has stakeholder gluttony or lust for the latest and greatest left us burned on a key initiative? How many times must we be kneecapped by corporate greed, accused of heresy for arguing for (or against) things like open source? Certainly too many of us have been victimized by the denizens of fraud, vendor violence, and tech-pro treachery.

Thankfully, as in Dante’s poetic universe, there are ways to escape the nine circles of IT hell. But IT pro beware: You may have to face your own devils to do it.

Shall we descend?
1st circle of IT hell: Limbo
Description: A pitiful morass where nothing ever gets done and change is impossible
People you meet there: Users stranded by vendors, departments shackled by software lock-in, organizations held hostage by wayward developers

There are many ways to fall into IT Limbo: When problems arise and the vendors start pointing fingers at each other; when you’re locked into crappy software with no relief in sight; when your programmers leave you stranded with nothing to do but start over from scratch.

You know you’re in Limbo when “the software guys are saying the problem is in hardware and the hardware guys are saying the problem is in software,” says Dermot Williams, managing director of Threatscape, an IT security firm based in Dublin, Ireland. “Spend eternity in this circle and you will find that, yes, it is possible for nobody to be at fault and everyone to be at fault at the same time.”

A similar thing happens when apps vendors blame the OS, and OS vendors blame the apps guys, says Bill Roth, executive vice president at data management firm LogLogic. “Oracle says it’s Red Hat’s fault, while Red Hat blames Oracle,” he says. “It’s just bad IT support on both sides.”

Michael Kaiser-Nyman, CEO of Impact Dialing, maker of autodialing software, says he used to work for a nonprofit that was locked into a donor management platform from hell.

“The software took forever to run, it only worked on Internet Explorer, it crashed several times a day, and was horribly difficult to use,” he says. “The only thing worse than using it was knowing that, just before I joined the organization, they had signed a five-year licensing agreement for the software. I wanted to kill whoever had signed it.”

Organizations also find themselves in Limbo when their developers fail to adopt standard methodologies or document their procedures, says Steven A. Lowe, CEO of Innovator LLC, a consulting and custom software development firm.

“Every project is an ordeal because they’ve made it nearly impossible to learn from experience and grow more efficient,” he says. “They spend most of their time running around in circles, tripping over deadlines, yelling at each other, and cursing their tools.”

How to escape: “When you’re digging a hole in hell, the first thing to do is stop digging and climb your way out,” says Roth. That means making sure you have the tech expertise in house to solve your own problems, going with open source to avoid vendor lock-in, and taking the time to refactor your code so you can be more efficient the next time around.

2nd circle of IT hell: Tech lust
Description: A deep cavern filled with mountains of discarded gadgets, with Gollum-like creatures scrambling to reach the shiny new ones at the top
People you meet there: Just about everybody at some point

The circle of tech lust touches virtually every area of an organization. Developers who abandon serviceable tools in favor of the latest and greatest without first taking the time to understand these new frameworks and methodologies (like Node.js or Scrum), thereby preventing anything from ever getting done. Managers who want hot new gizmos (like the iPad) and invent a reason why they must have them, regardless of the impact on the IT organization. Executives who become fixated on concepts they barely understand (like the cloud) and throw all of an organization’s resources behind them for fear of falling behind the competition.

“In reality, we all visit the circle of lust now and then,” says Lowe. “The problem with tech lust is the accumulation of things. You can get so mired in ‘we can’t finish this project because a new tool just came out and we’re starting all over with it’ that nothing ever gets done.”

How to escape: It is difficult to break free from the circle of tech lust, admits Lowe. “We all love shiny new things,” he says. “But you have to know what’s good enough to get the job done, and learn how to be happy with what you have.”

3rd circle of IT hell: Stakeholder gluttony
Description: A fetid quagmire filled with insatiable business users who demand more and more features, no matter the cost
People you meet there: Demons from sales and marketing, finance, and administration

This circle is painfully familiar to anyone who’s ever attempted to develop a business application, says Threatscape’s Dermot Williams.



12 famous passwords used through the ages

Written by admin
January 28th, 2014

Passwords seem like a recent thing, but they’ve been in use for a long time. Here are a dozen of the more memorable ones.

Passwords – we all have a million of them in our lives. Like them or not, you can’t escape having to use them for just about everything these days, from unlocking your mobile phone to accessing your bank account online to streaming a movie on Netflix. While the prevalence of passwords has greatly increased thanks to computers and the Internet, they’ve actually been used, in one form or another, to protect things for hundreds, and even thousands, of years. Inspired by a recent Quora thread, here are a dozen of the most famous passwords used through (mostly recent) history, in both the real and fictional worlds.

This story originally appeared on ITworld.com.

00000000
For many years during the Cold War, Minuteman nuclear missiles housed in silos in the United States required a trivial eight-digit code to be launched: 00000000. U.S. nuclear missiles were required to have launch codes by presidential order in 1962, to safeguard against rogue missile launches. While many missiles weren’t outfitted with this additional level of security for years, the codes were installed on U.S.-based Minuteman missiles under the direction of Secretary of Defense Robert McNamara. Once he left office, commanders at Strategic Air Command, who resented McNamara and were concerned about being able to launch the missiles quickly, set the launch codes to all zeroes. What could possibly go wrong?

Open Sesame
The granddaddy of all passwords is Open Sesame, the secret phrase used in the famous tale Ali Baba and the 40 Thieves to open a cave containing treasure belonging to a group of thieves. In the story, part of The Thousand and One Nights (also known as Arabian Nights), a collection of Arabic stories gathered over the centuries, Ali Baba, a poor woodcutter, overhears the secret phrase and (tl;dr) eventually gets the treasure. The phrase, of course, is well known and has appeared all over popular culture, including Popeye, Bugs Bunny and SpongeBob.

Chuck Norris
In 2010, an anonymous Facebook engineer claimed in an interview that, at one time, employees could log into any Facebook profile using a master password that was a variant on Chuck Norris (with some of the letters replaced by symbols and numbers). She claimed she had personally used it and knew of two other employees who had used it to log in and manipulate other users’ data and were subsequently fired. While she said the password no longer worked, it didn’t really matter, because it was replaced by a tool that let Facebook employees log in as another user with the click of a button – provided they had a good reason to do so.

Swordfish
In the 1932 Marx Brothers film Horse Feathers, Groucho Marx’s character, Professor Wagstaff, gains access to a speakeasy using the password Swordfish. Since then, it’s become one of the most well known (and spoofed) passwords and has been referenced over the years throughout popular culture. It’s popped up in (among many other places) Scooby Doo, Mad Men, FETCH! with Ruff Ruffman, Harry Potter, and Star Trek. There was even a hacker movie named after it, as well as a Commodore 64 video game.

Buddy
In June 2000, President Bill Clinton signed the Electronic Signatures in Global and National Commerce (E-SIGN) Act, which made electronic signatures and contracts legal in interstate and foreign commerce. Appropriately, Clinton signed the bill electronically, using a smart card that was encrypted with a private key named after his dog, Buddy. Aside from being easy to guess, the password was rendered even less secure when Clinton shared it with those in attendance at the signing in Philadelphia. Just to be safe, he also signed the bill the old-fashioned way – with a pen.

Joshua
In the classic 1983 geek film WarGames, Matthew Broderick’s character, teenage computer whiz David Lightman, hacks into what he thinks is a computer video game company to play some games. Lightman correctly guessed that the backdoor password to the system was Joshua, the name of the deceased son of the games’ programmer. It turns out that what he really hacked into was the North American Aerospace Defense Command’s (AKA NORAD) War Operation Plan Response (WOPR) computer and the game of Global Thermonuclear War that he starts almost results in World War III. Not only was the story fictional, of course, but so was WOPR, which was really made of plywood and powered for the movie by an Apple II.

Tiger
If you ever worked with an Oracle database, chances are you’ve come across the famous Scott schema, accessed with the password Tiger. This is a demonstration schema consisting of a handful of tables (e.g., EMP, DEPT), meant to illustrate some of the basic concepts of Oracle functionality. The schema was created by Bruce Scott, Oracle employee number 4, and the password was named after his daughter’s cat, Tiger. The Scott schema was installed with Oracle by default through version 8; since version 9 it’s still available for manual installation, though a number of newer sample schemas are now included.
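If a demo instance still has the schema installed, you can see it for yourself. Below is a minimal sketch using the cx_Oracle Python driver; the host and service name are assumptions and will vary by installation:

```python
# Illustrative sketch: querying the classic SCOTT demo schema.
# The host/service name below is an assumption -- adjust for your installation.
import cx_Oracle

conn = cx_Oracle.connect("scott", "tiger", "localhost/XEPDB1")
cur = conn.cursor()
cur.execute("SELECT ename, job, deptno FROM emp ORDER BY ename")
for ename, job, deptno in cur:
    print(f"{ename:<10} {job:<10} dept {deptno}")
cur.close()
conn.close()
```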

IAcceptTheRisk
In 1981, Xerox released the Star 8010 workstation, a revolutionary computer based on the earlier Alto prototype, that was meant to be used by businesses as part of an office document management system consisting of computers connected via Ethernet. The Star introduced many fundamental interface concepts that became popular, such as a graphical user interface, a bitmapped screen and clickable icons. To perform certain administrative functions on the Star (such as a system recovery), administrators had to use a code of 911 and the password IAcceptTheRisk. The password not only made the system more secure, but also served as an ad hoc Terms of Service.

Z1ON0101
In The Matrix Reloaded, the second movie in The Matrix trilogy, released in 2003, the character Trinity is seen hacking into the computer system of a power plant. Using a real network mapping tool called Nmap, and exploiting a real SSH vulnerability, she’s able to reset the root password for the system to Z1ON0101, and ultimately take control of it. The password is a variation of Zion, the name of the last human city left on Earth in the movie after the destructive war between humans and machines. The scene won acclaim from hackers for its accuracy – while the rest of the world liked it enough to generate $742 million in ticket sales.

12345
12345 is a famous password for several reasons. First, it’s one of the most commonly chosen passwords. Second, one of the people who chose it was Syrian president Bashar al-Assad, who picked it as the password to his email account, which was revealed when Anonymous hacked into it in 2012. Finally, 12345 is also known to fans of the Mel Brooks classic comedy Spaceballs as the password to the planet Druidia’s air shield – as well as the code to unlock Mel Brooks’ character’s luggage.

Sher
In the first episode of the second season of the BBC series Sherlock (A Scandal in Belgravia, 2012), the famous detective tried to crack the four-character security code on a mobile phone containing compromising photos of a member of the royal family. He finally figured out the code when he realized that the phone’s owner was sweet on him; the code was Sher, which went along with the text on the lock screen to spell “I AM SHER LOCKED.” Obviously, it was – in hindsight, at least – elementary.

Parc
In 1972, Xerox engineers at the newly formed Palo Alto Research Center (PARC) built a computer called the Multiple Access Xerox Computer (MAXC), which was a clone of a DEC PDP-10 time-sharing mainframe (after DEC wouldn’t sell Xerox a PDP-10). The MAXC was connected to ARPANET, one of the ancestors of the modern Internet. Guests could log into the MAXC over ARPANET using a guest account and a password of parc (or maxc, as the passwords were periodically swapped), proving that while those Xerox engineers were really smart and forward-thinking, they weren’t particularly creative.



 

12 Big BYOD Predictions for 2014

Written by admin
January 22nd, 2014

If you were just getting comfortable with BYOD, brace yourself for new twists and turns. CIOs can expect more devices to enter the enterprise in consumer clothing, real security threats to emerge, new MDM options and much more in 2014.

The BYOD mega trend is racing toward chaos in 2014. Can you COPE? Should you call security? What now, wearables? This year is going to be sheer madness as tablets and PCs come under the BYOD umbrella, upping the stakes – and the complications and confusion. It’s going to force changes to BYOD policy. It’s going to push CIOs to retool legacy apps and systems. And it’s going to test the already-strained relationship between IT and business.

Can You COPE?
Employees don’t really want to pay for their smartphones, tablets and PCs. They just want an easy-to-use device that can be used for both work and personal stuff. It’s actually better, of course, if the company foots the bill. Last year, we saw the emergence of the “company-owned, personally enabled” model, called COPE. This year, we expect to see real-world implementations. COPE is a hybrid approach that sits between free-for-all BYOD and traditional company-owned computers that forbade personal use. For more, check out IT Learns to COPE with Mobile Devices.

Tablets and PCs Come to Work
BYOD had been mostly a smartphone play, but in the second half of 2013, Forrester Research analyst David Johnson saw more PCs and tablets falling under the BYOD policy. Johnson says he expects the trend to continue this year. A move toward more powerful, more critical BYOD tools brings a plethora of technical challenges. With BYOD PCs, Johnson says, “There’s a lot more you have to secure in order for it to be considered acceptable, particularly in a regulated environment.”

Call Security!
Hoping to derail BYOD, many CIOs played the security card — that is, telling everyone who would listen that BYOD threatened corporate assets. But CIOs may have overplayed their hand. There hasn’t been a headline-grabbing security breach, prompting one analyst to claim that BYOD security was a “non-event.” But this year BYOD will expand to tablets and PCs rich with valuable corporate data, and hackers will take dead aim. That’s why we’re predicting a BYOD security bombshell in 2014.

Bye-Bye Stipends
Remember when your company reimbursed you for home Internet? Those were the good old days. Signs already point to BYOD going the same route, especially in areas where jobs are sparse and companies aren’t under pressure to provide perks. In 2014, we might be saying goodbye to device reimbursement and monthly stipends for mobile service. Caveat: If more PCs and tablets fall under a BYOD program, however, we might see stipends increase to cover them, says Forrester’s David Johnson.

IT Strikes Back
Last year, IT had to tackle BYOD head-on or risk being cut out completely. CIOs worked feverishly to change the culture from one that throws up roadblocks to one that embraces change. Tech leaders made big strides, and IT saved itself from becoming irrelevant. However, there is still a lot of work to be done with BYOD security and policy. Many companies have gaping BYOD security risks. This year, we’ll see IT shoring up networks and systems to make them BYOD-proof.

Revenge of the Rogue Worker
As IT asserts control over BYOD this year, there’s a chance end users will revolt. After all, BYOD was started by rogue business employees who felt IT was too rigid and slow in adopting consumer tech. The power pendulum shifted dramatically to end users and away from IT. Now IT hopes the pendulum is sliding back in its favor. This brings risk of an old danger: “If you start trying to increase control on employee-owned devices, then that’s a slippery slope,” says Forrester’s David Johnson.

Microsoft Gets Its Tablets in the Game
Microsoft lost the BYOD smartphone battle to Apple and Android and was on the verge of a complete collapse in the tablet space. In a Forrester survey conducted in late 2013, Apple iPads accounted for the vast majority of BYOD tablet deployments, with Android tablets making a serious run. Windows 8-based tablets were practically nonexistent.

Then the survey asked about BYOD tablet deployment plans in the next 12 months — and Windows 8-based tablets led the pack. What’s behind the turnaround? A lot of factors are trending Microsoft’s way, from refresh cycles to IT regaining some control over devices. Suffice to say, “the tablet in the enterprise is theirs to lose,” says Aberdeen’s Andrew Borg.

Year of the BYOD Mandate?
Two years ago, VMware made an aggressive move with BYOD by requiring all 6,000 employees in the United States to use personal smartphones for work. Last summer, a Gartner survey of CIOs showed mandatory BYOD gaining steam, prompting Gartner to predict that half of employers will require employees to supply their own device for work purposes by 2017. Then the call for mandatory BYOD quieted down in the latter part of last year.

So will we hear the mandatory BYOD chatter starting up again in 2014? As BYOD becomes the new normal, we’re predicting a few more companies will put the onus on employees to buy and use their own smartphones for work as a condition of employment.

Mobile Device Management Mayhem
Last year was a good one for mobile device management (MDM) vendors, as companies began to realize the need to get a handle on BYOD. The MDM market has been flooded with newcomers and is evolving at a wickedly fast clip. From expanding their portfolios to cover app management, to dealing with innovations such as app wrapping and virtual workspaces, to working with device makers such as Apple and Samsung, MDM vendors have their work cut out for them. Then there are the giant tech companies getting into MDM, such as Dell and possibly BlackBerry. MDM mayhem is sure to be a part of 2014.

End of Legacy Apps
One of the great inhibitors to BYOD tablets is legacy applications that have browser dependencies on older versions of Internet Explorer or are locked up behind the firewall and not easily accessible. Employees are fed up with apps that chain them to the desktop. Instead, they want to use their shiny new iPads and Android tablets that they unwrapped over the holidays for work.

“In 2014, companies will begin to put practical programs in place, continue to accelerate their move to Software-as-a-Service wherever they can,” says Forrester’s David Johnson. “Companies will start to figure out how to modernize their core applications to be more friendly to BYOD.”

Virtual Desktop, Round 2
Not every app can be modernized quickly, especially systems of record. So how will IT serve up these apps to BYOD tablets and PCs? BYOD is already starting to breathe new life into the virtual desktop. It’s an infrastructure that is more resilient and tolerant of devices that are not well configured yet need to access systems of record. Virtual desktop infrastructure, in fact, is at the heart of Seattle Children’s Hospital’s BYOD strategy.

“We’re going to see more investment in those technologies in 2014,” Forrester’s David Johnson says.

Wearables Wreak Havoc
Sparked by Google Glass and smartwatches, wearable gadgets have become a major force in the tech sector. They’re not just for consumers, either. Forrester analyst J.P. Gownder predicts wearables will soon take the enterprise by storm, with the company-provided wearables market surpassing the consumer market within the next five years. It’s quite possibly a BYOD wearables future.

Is IT ready? Hardly. “IT is just trying to catch a breath with BYOD smartphones,” says Aberdeen’s Andrew Borg. “I don’t think they’re even remotely prepared for wearables and other smart devices that are going to attempt to get on the corporate networks and access corporate files. This can loom as a big issue in 2014 and beyond.”

 


 


14 steps to a better, faster Windows laptop

Written by admin
January 20th, 2014

After a couple of years, most laptops fall seriously behind the times. Here are several ways to upgrade your legacy laptop to meet today’s standards.

Time to upgrade
I recently explained how you can clean out your laptop and increase its efficiency. However, sometimes what you really need is an upgrade.

What follows is a step-by-step rundown on how to give a legacy notebook a new lease on life by adding more RAM and a solid state drive (SSD), as well as the latest USB, HDMI and Wi-Fi hardware. As an example, I used my three-year-old HP EliteBook 2560p laptop, which is powered by a second-generation 2.6GHz Intel Core i5 processor and includes 4GB of RAM, a 320GB hard drive and Windows 7 Professional.

(I have not upgraded my operating system, although you can always switch to Windows 8 or Linux if you choose.)

Grab your tools and let’s get started.

1. Grab your tools
The first step is to make sure you have the necessary tools and components. Everything is available online or at a good electronics store. These can vary depending on your needs, but should be good for upgrading most notebooks. First, let’s start with the tools you’ll need:
– A small Phillips screwdriver (I used a number 0)
– A DVD-RW disc or USB drive (for creating startup media)
– An external hard drive (for backing up data)
– A pencil with an eraser (for pushing RAM into place)
– A small bowl to hold screws and other easily lost objects
– A marking pen for labeling

2a. Choose your components
The best strategy is to shop around to find the best — and the most reasonably priced — components for your particular system. The parts that I bought for revamping my EliteBook are listed below. I chose these for two reasons: Because I’m familiar with the vendors and/or products, and because they fit my system.

This list shows what I paid; prices may have changed since then (clockwise, from top):

– Cable Matters Gold Plated DisplayPort HDMI adapter ($10)
– Netgear A6100 (AC600) WiFi USB Mini Adapter ($50)
– Kingston 8GB DDR3 1333MHz SDRAM module (two at $90 each — more about this on the next slide)
– Crucial M500 480GB SSD ($370)
– StarTech 2 Port ExpressCard SuperSpeed USB 3.0 Card Adapter ($32)

2b. Select the right memory
Adding RAM to a computer boosts performance considerably. The EliteBook’s 4GB of RAM was skimpy to say the least, but rather than boosting it to 8GB or 12GB, I decided to add 16GB of RAM so I could squeeze every last bit of speed out of it.

I used Kingston Technology RAM modules, but you can get the parts from a variety of sources such as Crucial, Patriot Memory, PNY or any of the dozens of others. Most vendors offer an online ordering system, where you type in the model and details about your machine. For my EliteBook, I ordered two 8GB DDR3 SDRAM modules. At $90 each, the 8GB modules aren’t cheap, but I figured the extra performance would be more than worth the price.

2c: Select your storage: Hard drive or SSD?
Instead of swapping the EliteBook’s rather slow 320GB hard drive with a larger one, I decided to replace it with a 480GB solid-state drive (SSD), which is roughly five times faster at reading and writing data, can take more abuse and uses less power than rotating media does. Since much of my current work now resides in the cloud, the storage space should be quite sufficient.

On the other hand, an SSD costs roughly five to six times more than a hard drive for a lot less storage space. For instance, the 2.5-in. Crucial M500 SSD I installed cost me $370 versus about $60 for a 500GB hard drive. So it all depends on your own needs.

3. Getting inside
After turning the machine off and removing its battery, I slid the bottom panel free. Laptops differ widely in accessibility; some feature several small hatches rather than a single removable bottom panel. If that’s the case for yours, look on the bottom panel for a chip icon or other label.

The first task will be to upgrade the RAM. After unscrewing and removing the panel, you should find the memory modules.

4. Replacing the RAM
Once you’ve found the RAM, press down on the board’s edges with a pencil eraser to release the module and remove it. Then line up the contacts of each new module with those on the motherboard. Do it one at a time at a 45-degree angle and press down until the memory board snaps into place.

When you’re done, replace the battery, turn the system on and make sure that the new RAM is working properly by right-clicking on the Computer entry in Windows Explorer and clicking on Properties. This brings up the Windows System page. In this case, the system recognizes the new memory, so we’re ready to roll.
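If you’d rather confirm the new memory from a script than from the System page, a few lines of Python will do it. This is a minimal sketch that assumes the third-party psutil package is installed; it simply reports how much RAM the OS can see:

```python
# Minimal sketch: confirm how much RAM the OS can see after the upgrade.
# Requires the third-party psutil package (pip install psutil).
import psutil

total_gb = psutil.virtual_memory().total / (1024 ** 3)
print(f"Installed RAM visible to the OS: {total_gb:.1f} GB")
if total_gb < 15.5:  # a 16GB kit typically reports roughly 15.8-15.9 GB
    print("Warning: less memory than expected -- power off and reseat the modules.")
```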

5a. Create a start-up drive: Windows 7
Now, it’s time to upgrade the storage.

But first you need to create an external start-up drive to use after you’ve installed the new blank drive. Either a DVD or a USB drive will work. (I prefer a DVD, because I can put it away for future emergencies.)

Go to the Control Panel’s Backup and Restore page. If you’re using a blank DVD, click on “Create a system repair disc” and then “Create Disc.” It takes about 10 minutes to compile and burn the disc. If you’re using an external USB drive, go to “Create a system image” instead and, in the set-up window, select the drive to save it to.

5b. Create a start-up drive: Windows 8
For those using Windows 8, it’s a little more involved. Go to the Control Panel’s File History option. Then click on System Image Backup in the lower left corner. Choose whether you want the backup saved on a hard drive, DVD or a network location.

After picking the drive to back up from, confirm what you’re doing and click on Start Backup to get it going. The system will then format the drive or ready the DVD, and start the process. Near the end, you’ll get a Create System Image window that lets you make a startup disk to get the machine going with a blank drive in place.

Some vendors offer software tools to help create recovery media. Check your manual or with your vendor.

6. Back up your data
With the start-up media done, it’s time to back up the existing hard drive using an external USB hard drive. (If you have a large-enough cloud storage account, you can back everything up online, but it will likely take much longer.)

After you attach the external drive, go to Windows’ Backup and Restore page and click on “Set up backup.” Highlight the external drive as the data’s destination and select “Let me choose.” You can then set the software to copy every file.

Next, click on the data you want copied: Local Disk (C:). Click Next to start.
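Windows Backup handles the full image, but if you also want a plain, browsable copy of a critical folder on the external drive, a short script can do it. A minimal sketch, assuming hypothetical source and destination paths — adjust them for your own user profile and drive letter:

```python
# Supplemental sketch: copy one critical folder to the external drive as a
# plain file backup alongside the Windows system image. Paths are assumptions.
import shutil
from pathlib import Path

source = Path(r"C:\Users\YourName\Documents")  # hypothetical profile folder
dest = Path(r"E:\laptop-backup\Documents")     # hypothetical external drive letter

dest.parent.mkdir(parents=True, exist_ok=True)
shutil.copytree(source, dest, dirs_exist_ok=True)  # dirs_exist_ok needs Python 3.8+

copied = sum(1 for p in dest.rglob("*") if p.is_file())
print(f"Copied {copied} files to {dest}")
```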

7. Remove the old drive
Depending on how much data you have, it could take an hour or two to move all the data to the external hard drive (it took me about an hour and 15 minutes). When it’s finished, shut the system down, flip the machine over and find the hard drive. I just left the bottom panel off during the backup; if you haven’t, remove it again.

Loosen the four screws that hold the hard drive in place and put the screws into a small bowl. Finally, slide the drive out by its plastic tab.

After that, carefully loosen the screws that attach the drive to its bracket and put the drive aside. (You might want to use it as a spare.)

8. Install the new drive
It’s time to install the new drive. After screwing the new hard drive or SSD onto the drive bracket, slide the drive into place and finish up by screwing the bracket into the notebook. You can also replace the bottom panel.

Now you have to fill the drive up again. Boot Windows from the start-up disc you created. My EliteBook boots from its optical drive if a disc is present; other systems may require that you change the BIOS settings to boot from its optical drive. For notebooks without an optical drive, use the same external drive you used to create the start-up disc.

9a. Reload your OS: Windows 7
It will take a few minutes to start the system, format the new drive and load Windows onto it from the DVD disc. Once it’s done, boot your new drive, click on “System Recovery Options,” select “System Image Recovery” and the laptop will then find the backup files on the external hard drive. Click Next to start the restoration process of moving the data to the new drive.

From here on out, it’s all automatic. It should take about an hour or so. In other words, it’s time for a coffee break.

9b. Reload your OS: Windows 8
To restore from a backup using Windows 8, start by rebooting the computer while holding the shift key. This will bring up the troubleshooting window.

Click on Advanced options and then on System Image Recovery to start the process. After picking the backup image you want to use (the one that was just made), click Finish and the system will begin to copy your files onto the new drive.

If you want to upgrade to Windows 8 from Windows 7, it’s fairly simple: Instead of using your startup DVD, put the upgrade disc into the drive, select everything that you want to move (Windows settings, personal files and apps) to the new OS and click Next. Plan on it taking a couple of hours.

10. Add USB 3.0
When my EliteBook came out three years ago, USB 3.0 was a luxury, but now even budget machines use this faster standard. Happily, the system has an ExpressCard slot that can take a USB 3.0 adapter card. (Unfortunately, a regular PC Card slot isn’t fast enough to keep up with USB 3.0.)

There are a variety of cards available from vendors like Sonnet and Sabrent; I opted for StarTech’s $32 2-port ExpressCard adapter.

I loaded the software, inserted the card in the ExpressCard slot and let the hardware install itself. Once installed, the card’s two USB 3.0 ports increased throughput with an external drive from 26.6Mbyte/s to 87.1Mbyte/s, more than a threefold improvement.
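If you want to reproduce that kind of before/after comparison on your own hardware, timing a large sequential write is enough for a rough number. Here’s a minimal Python sketch; the drive letter and test-file size are assumptions, and a dedicated benchmarking tool will give more rigorous results:

```python
# Rough sketch: time a large sequential write to estimate throughput to an
# external drive. The drive letter and file size are assumptions.
import os
import time

TEST_FILE = r"E:\throughput_test.bin"  # hypothetical external drive
SIZE_MB = 512
CHUNK = b"\0" * (1024 * 1024)          # 1 MB of zeroes

start = time.perf_counter()
with open(TEST_FILE, "wb") as f:
    for _ in range(SIZE_MB):
        f.write(CHUNK)
    f.flush()
    os.fsync(f.fileno())               # make sure the data actually hits the drive
elapsed = time.perf_counter() - start

print(f"Wrote {SIZE_MB} MB in {elapsed:.1f} s -> {SIZE_MB / elapsed:.1f} MB/s")
os.remove(TEST_FILE)
```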

11. Add HDMI
Rather than having an HDMI port, the EliteBook came with a DisplayPort video connector. DisplayPort works well with my monitor in the office, but when I travel, I have to connect a projector using the system’s VGA port, which doesn’t handle audio. It’s an easy fix with a DisplayPort-to-HDMI adapter, which averages about $10 to $20.

If you don’t have a DisplayPort connector, look for a VGA-to-HDMI adapter, which should cost about $25 to $30.

12. Upgrade to 802.11ac
The current Wi-Fi standard is still 802.11n, but you may want to consider upgrading to 802.11ac. While it lacks formal IEEE approval, 802.11ac is stable and works with existing gear. The new protocol can deliver data at up to 1.3Gbps, about three times the rate of 802.11n.

I chose Netgear’s AC600 WiFi USB Mini Adapter ($50) because it sticks out only an inch from the notebook. It improved my Wi-Fi reception from 3 bars to 5 bars on the Windows Wi-Fi signal strength meter in my laptop’s task tray and extended the system’s range by 15 ft. More to the point, my online access speed rose from about 9Mbit/s to 12Mbit/s.
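To confirm what link rate the new adapter actually negotiates, you don’t need extra software; Windows’ built-in netsh command reports it. Below is a minimal sketch that just filters the relevant lines (field labels may differ on non-English versions of Windows):

```python
# Quick check (Windows only): read the Wi-Fi link rate and signal strength
# reported by the built-in "netsh wlan show interfaces" command.
import subprocess

output = subprocess.run(
    ["netsh", "wlan", "show", "interfaces"],
    capture_output=True, text=True, check=True,
).stdout

for line in output.splitlines():
    line = line.strip()
    if line.startswith(("Radio type", "Receive rate", "Transmit rate", "Signal")):
        print(line)
```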

13. Update your BIOS
As long as you’re updating the hardware, it’s a good idea to update the system BIOS. This should be done whenever a new version comes out, but it’s easy to ignore. In the case of my HP laptop, the most recent version fixes a few problems.

Start by finding and downloading the newest BIOS for your system (usually, you can find it on the manufacturer’s support page) and transferring it to the desktop so it is handy. The new BIOS usually comes with a transfer utility that controls the data flow. Do the transfer with the machine plugged in — if the transfer is interrupted, your computer may not start.

14. Check to see everything’s working
Before I returned the EliteBook to service, I wanted to verify everything inside was working properly. I used HP’s Support Assistant software, which came with my EliteBook. A good alternative is AVG’s TuneUp Utilities 2014 ($50; free 15-day trial), which can do everything from removing duplicate files and cleaning up a hard drive to fixing Registry problems and making the system start faster.

It’s also a good idea to make absolutely sure that your upgraded notebook can take the heat. I use PassMark BurnInTest ($39; free version available), which runs a variety of tasks simultaneously while noting any faults. This is harsher treatment than you’ll give the machine in normal use, but it’s a good test of the system’s mettle.
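If you don’t want to buy a dedicated tool, you can at least give the CPU a quick workout with a few lines of Python. This bare-bones sketch keeps every core busy for five minutes and checks that a known computation keeps producing the right answer; it’s a sanity check, not a replacement for BurnInTest:

```python
# Bare-bones CPU burn-in sketch: keep every core busy for a fixed period and
# verify a known computation stays correct. A sanity check, not a BurnInTest
# replacement.
import multiprocessing as mp
import time

DURATION_S = 300  # five minutes

def burn(_worker_id):
    end = time.time() + DURATION_S
    errors = 0
    while time.time() < end:
        # deterministic workload whose result we can verify on every pass
        if sum(i * i for i in range(10_000)) != 333_283_335_000:
            errors += 1
    return errors

if __name__ == "__main__":
    with mp.Pool(mp.cpu_count()) as pool:
        results = pool.map(burn, range(mp.cpu_count()))
    print("Total compute errors across all cores:", sum(results))
```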

An overall improvement

All told, the upgrade of my HP EliteBook took about three hours. Increasing the system’s RAM to 16GB meant that its PassMark PerformanceTest score of 937.1 went up to 1,241.7. After I swapped the EliteBook’s hard drive for a high-speed SSD, the score rose to 1,750.7 — nearly double the original system’s performance.

Of course, all these upgrades weren’t free. The bill for the various materials I used for my upgrade added up to $642 — about the price of a new budget machine.

However, you can pick and choose which upgrades you really need and which you don’t. The result will be a rejuvenated laptop that you can depend on — or that you can confidently pass on to a friend or relative.



 

 


 

 

The Taiwanese PC maker is bringing its Padfone device to the US late this year

PC maker Asus is taking the Windows-Android hybrid concept to another level with a convertible laptop that can switch between the two OSes at the press of a button.

The Asus Transformer Book Duet TD300 comes as a 13.3-inch notebook with a detachable keyboard, which when removed turns the device into a tablet. But unlike other PC convertibles, the new Asus product can switch between the Windows and Android OSes in either tablet or laptop mode.

By pressing an “OS Switch” button located on the screen, the product will switch to the other operating system in five seconds. The device itself runs Intel’s fourth-generation Haswell chip, and will arrive in late March with a starting price of US$599.

At the International CES show on Monday, the Taiwanese PC maker also announced that it would finally bring its Padfone product to the U.S., in a partnership with AT&T. The new model, called the Padfone X, is another hybrid that can turn from a smartphone into a tablet.


Like the previous models, it works as an Android smartphone that becomes a tablet when snapped into a tablet dock. The handset portion of the Padfone X is a 5-inch smartphone with a 2300mAh battery, while the tablet has a 9-inch screen.

The Padfone X will arrive in the U.S. midyear, use the latest Qualcomm processor and support 4G LTE.

In addition, Asus on Monday unveiled its series of Android handsets called Zenfones that will be released in this year’s first quarter in markets outside the U.S. The Zenfones will come in 4-inch, 5-inch and 6-inch screen versions, and all use variants of Intel’s latest Atom processor. Asus also introduced a Padfone mini device that can switch between a 4-inch phone and a 7-inch tablet.

Software on the phones will use Asus’ new “Zen UI,” an interface that incorporates more than 200 company-made enhancements.



 

 

The Syrian Electronic Army hacked all of Skype’s social media accounts and accused Microsoft of helping the government spy and monitor our email.

It’s said there is no rest for the wicked, and New Year’s Day had Skype social media managers scrambling to scrub evidence of being hacked off the Skype blog and the company’s Twitter and Facebook accounts. That evidence was planted by the Syrian Electronic Army and accused Microsoft of spying for “governments.”

After the SEA’s attack, Skype sent out a pair of tweets to its 3 million Twitter followers, warning:

[Screenshot: hacked Skype tweet warning users not to use Microsoft products]

[Screenshot: Skype tweet telling Microsoft to stop spying on people]

Those Skype tweets were deleted and then replaced with this tweet: “You may have noticed our social media properties were targeted today. No user info was compromised. We’re sorry for the inconvenience.”

The SEA also hacked the Skype blog:

[Screenshot: Skype blog, Facebook and Twitter accounts hacked by the Syrian Electronic Army]

[Screenshot: hacked Skype blog post saying don’t use Microsoft products]

These posts were mirrored on Skype’s Facebook page before quickly being deleted.

[Screenshot: hacked posts removed from Skype’s Facebook page]

Then reporter Matthew Keys tweeted this screenshot “proof” of the Skype hack sent to him by the SEA.

[Screenshot: proof of the Skype hack sent to Matthew Keys by the SEA]

The SEA also tweeted Steve Ballmer’s contact information along with the message, “You can thank Microsoft for monitoring your accounts/emails using this details. #SEA”

Although the SEA has successfully hacked many major companies, the Skype hack seems to refer to Microsoft’s alleged cooperation with the NSA. Microsoft denied providing backdoor real-time access, but revelations provided by Edward Snowden indicated that the NSA can successfully eavesdrop on Skype video calls. Although Microsoft vowed to protect users from NSA surveillance, the Redmond giant “forgot” to mention Skype in its promises.

As security expert Graham Cluley pointed out, “Chances are that Skype didn’t read my New Year’s resolution advice about not using the same passwords for multiple accounts.”

In fact, Skype seems to have disregarded its parent company’s advice. Microsoft’s Security TechCenter has a post on “selecting secure passwords.” Regarding “Password Age and Reuse,” it states:

Users should also change their passwords frequently. Even though long and strong passwords are much more difficult to break than short and simple ones, they can still be cracked. An attacker who has enough time and computing power at his disposal can eventually break any password. In general, passwords should be changed within 42 days, and old passwords should never be reused.

Skype itself has a few password “rules” such as:

A password must:

Be at least 6 characters and not longer than 20 characters.

Contain at least one letter and one number.

Not have any spaces.

Not contain your Skype Name (case insensitive).

Not be a part of Skype Name (case insensitive).

Your password also cannot contain any of the following words:

1234, 4321, qwert, test, skype, myspace, password, abc123, 123abc, abcdef, iloveyou, letmein, ebay, paypal.
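Taken together, those rules are easy to express as a short check. Here’s a minimal sketch that follows the list above; the function name and the way the banned words are handled are illustrative, not Skype’s actual validation code:

```python
# Minimal sketch of the password rules listed above. Illustrative only --
# this is not Skype's actual validation code.
import re

BANNED = {"1234", "4321", "qwert", "test", "skype", "myspace", "password",
          "abc123", "123abc", "abcdef", "iloveyou", "letmein", "ebay", "paypal"}

def is_valid_skype_password(password: str, skype_name: str) -> bool:
    if not 6 <= len(password) <= 20:
        return False                                  # length limits
    if not (re.search(r"[A-Za-z]", password) and re.search(r"\d", password)):
        return False                                  # needs a letter and a number
    if " " in password:
        return False                                  # no spaces
    p, name = password.lower(), skype_name.lower()
    if name in p or p in name:
        return False                                  # not contain / not be part of the Skype Name
    return not any(word in p for word in BANNED)      # none of the banned words

print(is_valid_skype_password("open sesame1", "alibaba"))  # False -- contains a space
print(is_valid_skype_password("Sw0rdfish", "groucho"))     # True under these rules
```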

However, after the Skype hack gave Microsoft a black eye with spying accusations, it’s a pretty safe bet that whoever controls Skype’s social media accounts will no longer reuse the same password to protect all of them. And if you reuse the same password on different sites, it would be a great 2014 resolution to change all your passwords, keep them in a password manager, and make sure you don’t use the same one for multiple sites.

 



 

 

How to get a job in financial IT

Written by admin
January 2nd, 2014

IT practitioners with the right mindset — analytical and dogged — will find themselves welcome in the resurgent financial services industry.

It’s been a rough few years for people seeking work in the finance industry. Its growth-by-acquisition strategy over the last decade meant that companies needed fewer new workers, and its contribution to the economic meltdown of a few years ago soured many potential employees on the field.

And salaries haven’t exactly skyrocketed either. Data from a recent survey by recruiting firm Mercer in conjunction with Gartner shows salaries for several finance IT positions are on par with — but not statistically higher than — those positions for the whole IT industry.

Even so, signs indicate that financial services, including IT, is heating up again. While salaries may be comparable, total compensation (including bonuses) was considerably higher than the national average for three of the four finance IT positions cited in Mercer’s survey results. Additionally, Robert Half Technology ranks financial services among the top five fastest-growing industries in six of the nine U.S. regions it tracks.

What’s changed on the hiring front in the financial services industry? In a word: technology.


 


 

 

Which smartphone is the most secure?

Written by admin
December 25th, 2013

Not all mobile phone operating systems are created equal. As Spencer McIntyre of SecureState explains, there are unique differences and threats specific to each smartphone and, in the end, security is largely up to the user

These days, it is almost impossible to meet someone who doesn’t own a cell phone. More specifically, whether it’s the trendy iPhone, the corporate-favored BlackBerry, or a modern Windows Mobile device, almost everyone has joined the smartphone frenzy, and with good reason: a smartphone offers more advanced computing ability and connectivity than a contemporary feature phone.

Much as with a handheld computer, most of the population relies on the smartphone’s operating system to juggle the demands of work, personal life, and finances. However, many smartphone users forget about the risks of malware on these crucial devices. In fact, a study from Rutgers University found that malicious software for cell phones could pose a greater risk to consumers’ personal and financial well-being than computer viruses.

Clearly, there is a need for greater protection of cell phone software and greater awareness of cell phone vulnerabilities among owners, especially when it comes to which operating system you are using. There are unique differences and threats specific to each smartphone. Here are some key points consumers should consider to protect their mobile operating systems.

iPhone
There is a lot to be found regarding this popular device; half of our research findings concerned the iPhone. Malware for this device took a different approach with the release of iOS 4: because the multitasking users now rely on runs unnoticed in the background, malware is easier to miss and can be less intrusive. Malware is more commonly found on iPhones that have been jailbroken.

“Jailbreaking” means freeing a phone from the limitations imposed by the wireless provider and, in this case, Apple. Users install a software application on their computer and then transfer it to their iPhone, where it “breaks open” the iPhone’s file system and allows it to be modified; however, this also opens the device up to malware. By jailbreaking a phone, users may let malicious applications onto the device that have access to their personal information, including their bank accounts. These applications are not subject to the same limitations Apple imposes and are therefore easier to obtain from a rogue source and use to infect the phone.

Additionally, if the default password on a jailbroken iPhone is not changed, its SSH service makes it easy for attackers to create worms that infect the device. The Ikee worm, created to raise security awareness around jailbroken devices, illustrates how serious this threat is: once the core payload has run its course, the attacker can gain complete control of the system.
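
As a hedged self-audit sketch (not the worm itself), the following Python snippet, which assumes the third-party paramiko SSH library is installed and that you know your phone's IP address (the address below is hypothetical), simply checks whether a jailbroken device on your own network still accepts the well-known default credentials:

    import paramiko   # third-party SSH library; assumed to be installed

    # Jailbroken iPhones historically shipped OpenSSH with the well-known
    # default password "alpine", which is what worms such as Ikee exploited.
    def accepts_default_ssh_password(host):
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        try:
            client.connect(host, username="root", password="alpine", timeout=5)
            return True        # default credentials accepted: change them immediately
        except Exception:
            return False       # login refused, SSH not running, or host unreachable
        finally:
            client.close()

    print(accepts_default_ssh_password("192.168.1.50"))   # hypothetical phone IP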

Apple has also been slow to patch vulnerabilities, including the SMS (texting) exploit released in the summer of 2009 by Charlie Miller. That episode also showed that Apple can be so slow to release fixes that third-party organizations produced a security patch before Apple did.

Windows Mobile
When it comes to threats, Windows Mobile takes the cake for attracting malware via SMS: the amount of SMS malware found on Windows Mobile devices is much higher than on the others. An interesting facet of the Windows Mobile OS is that many of its system calls are shared with its full-featured desktop counterpart. This has allowed many pieces of malware that originated on the Windows desktop OS to be ported to Windows Mobile. A noteworthy example is the Zeus botnet, which in recent years has begun to appear on mobile versions of Windows.

BlackBerry
A popular alternative to the previous two mobile operating systems, the BlackBerry is also quite different from the typical smartphone. The BlackBerry runs what is arguably the most closed of the operating systems discussed here. Research In Motion, the developer of the BlackBerry, has done an excellent job of keeping the sensitive inner workings of the smartphone secret from the public. This is a contributing factor in the relatively small number of reliable exploits for the BlackBerry.

BlackBerry also suffers from the same multitasking concerns that make it easier for malware to run unnoticed. An interesting proof of concept developed for the BlackBerry is the BBProxy application, which was presented at DEFCON.

Symbian
There is not a lot of information regarding malware for this operating system, although it is the oldest of the smartphone platforms and one of the most popular outside of America. Some malware families populate Windows Mobile, BlackBerry, and Symbian but are not present on Android or the iPhone. Along with the Windows Mobile family of phones, Zeus has been ported to Symbian as well. The mobile version of Zeus is used to intercept text messages sent as the second factor of authentication for many services.

Android
The Android operating system is the only open-source operating system discussed here. Android is unique in that it is community driven: the operating system is not owned by a single organization, so it is developed in the best interest of its users. However, applications in the marketplace are not monitored for vulnerabilities, so anyone can submit applications containing malicious functions, which are less likely to be caught. Essentially, it is up to users to determine whether the source they are getting an app from is safe and reputable.
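
One simple, hedged way a user can check that an app really came from the source they trust is to compare the download's checksum against a value the developer publishes; the file name and expected digest in this Python sketch are placeholders, not real values:

    import hashlib

    # Assumption: the developer publishes a SHA-256 checksum for the APK on an
    # official page; the file name and expected digest below are placeholders.
    EXPECTED_SHA256 = "0" * 64            # replace with the published checksum

    def sha256_of(path):
        """Compute the SHA-256 digest of a file, reading it in chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    actual = sha256_of("downloaded-app.apk")
    if actual == EXPECTED_SHA256:
        print("Checksum matches the published value.")
    else:
        print("Checksum mismatch - do not install this APK.")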

Amazon now runs a third-party marketplace, which imposes additional policies and restrictions on the applications it distributes.

Android is based on the Linux operating system, yet unlike the situation with Windows Mobile, there is not much evidence of Linux malware being ported to Android. This is not because there is no known Linux malware, but because Linux malware receives far less attention.

In Conclusion
All of these operating systems have distinct strengths and weaknesses; in the end, however, much comes down to the user and how the device and its passwords are configured. Users need to remember not to install apps from untrusted sources, especially unknown ones. While users can’t vet every app, they should ensure each one comes from a reputable source; that is where malware commonly comes from, with backdoor apps masquerading as secure applications. Jailbroken phones are at huge risk if the user keeps the default password, and at even greater risk when apps are installed from outside the Apple marketplace. Instances of malware exist on all of these phones and are even more prevalent on those using untrusted app sources. Consumers can keep this research in mind when using their smartphones to best protect their valuable information.



Coincidence? Bruce Schneier — outspoken opponent of NSA mass surveillance — leaving British Telecom

Carrier BT provides the British intelligence agency GCHQ and its American counterpart the NSA with direct access to customer data through the Internet modems it supplies, claims a 50-page document posted anonymously on the Cryptome site today.

The document, titled “Full Disclosure: The Internet Dark Age,” was originally posted on Dec. 4 and reposted with updates a few days later. Its anonymous authors, calling themselves “The Adversaries,” say they are engineers in a business that supplies small office and home office networking in the United Kingdom. The document goes into extensive detail about their claim that modems supplied by BT have secret backdoors that can be used to send customer data directly to U.K. and U.S. intelligence agencies, or even to give surveillance agencies a means to attack, should that be required.


BT spokesperson Kris Kozamchak, head of BT Global Services, would not comment on the contents of the document, but simply stated: “We comply with the law wherever we operate and do not disclose customer data in any jurisdiction unless legally required to do so.”

Also on the security front at BT, Bruce Schneier, who has held the post of “security futurologist” at the company for about eight years, is leaving the telco at the end of December, according to a spokesman. Since the disclosures in June about the National Security Agency related to documents leaked to the media by former NSA contractor Edward Snowden, Schneier has been an outspoken opponent of the type of mass surveillance, backdoors and encryption weakening alleged to be done by the agency and its partners, which include GCHQ. Schneier’s commentary about the NSA appeared frequently online.

Schneier could not immediately be reached for comment.

 

