Archive for the ‘Tech’ Category


Google is on track to spend more money this year attempting to influence lawmakers than any other tech company

Google and Facebook continued to pour millions of dollars into political lobbying in the third quarter in attempts to influence U.S. lawmakers and have legislation written in their favor.

Google spent $3.94 million between July and September while Facebook spent $2.45 million, according to disclosure data published Tuesday.

The only tech-related company to outspend Google was Comcast, which is trying to persuade politicians to look favorably on a merger with Time Warner and spent $4.23 million during the quarter.

But Google stands as the largest spender in the entire tech industry to date this year. It has run up a $13 million bill lobbying Washington politicians and their offices on a range of issues as diverse as online regulation of advertising, cybersecurity, patent abuse, health IT, international tax reform, wind power and drones.

If industry spending continues at its current level, 2014 will mark the fourth year that Google has spent more money on federal lobbying than any other technology company.

Facebook began lobbying Washington in 2009 and has quickly risen to become the fourth-largest spender in the tech industry so far this year, behind Google, Comcast and AT&T.

The company’s lobbying hits an equally diverse range of areas including cyber breaches, online privacy, free trade agreements, immigration reform, Department of Defense spending and intellectual property issues.

Another notable spender in the third quarter was Amazon, which plowed $1.18 million into its lobbying efforts. That represents a quarterly record for the Seattle company and is the second quarter in a row that it has spent more than $1 million on lobbying.

Amazon’s lobbying was aimed at many of the same areas targeted by Google and Facebook, but covered additional subjects close to its business, including postal reform, online wine sales, mobile payments and Internet tax payments.

The money is funneled to D.C. lobbying firms that use it to push their clients’ agendas to politicians and their staffers. The lobbying disclosure reports are published quarterly by the U.S. Senate and detail spending in general areas, but do not go into specifics.

Lobbying has long been an effective tool used by major companies, but it’s only been in the last few years that Internet companies have started spending money in amounts to rival traditional tech giants.

During the third quarter, other major spenders included Verizon ($2.91 million), CTIA ($1.95 million), Microsoft ($1.66 million) and Oracle ($1.2 million).

Apple spent just over $1 million in the quarter lobbying on issues including consumer health legislation, transportation of lithium ion batteries, international taxes, e-books, medical devices and copyright.


 


10 Tips to Ensure Your IT Career Longevity

Written by admin
October 19th, 2014

Enjoying a long career doesn’t happen by accident. It takes planning and effort. Use these tips to get your head in the game and keep your eye on the future.

Many people say that IT and technology are a young man’s game, and if you look at the most influential tech companies you might agree: most IT workers employed at those companies are under 35 and male. However, these big-name firms employ only a fraction of tech professionals, and there are plenty of opportunities out there for everyone. IT has one of the lowest unemployment rates of any industry because, in most organizations, technology touches every part of the business.

Be Responsible for Your Own Career
Achieving career longevity in the IT business takes effort, time and resources — and nobody but you can organize, facilitate and be responsible for all of it. To stay ahead of the learning curve you need to think about your goals and architect your future.

Many organizations are getting better at providing embedded employee performance and career management processes, according to Karen Blackie, CIO of Enterprise Systems & Data for GE Capital. However, she warns that you are your own best advocate and should always strive to “own” your career. Don’t wait for your organization to do it for you because that day may never come.

This means stepping back and thinking about where you want to be in X amount of time and then outlining the different skills and experience needed to get there. With that information you can start mapping out your career. “Doing research into what interests you, setting goals and objectives and then having a plan around how you will accomplish those goals is very important,” says Blackie. Remember positions get eliminated and things don’t always work out so it’s wise to consider alternate paths.

Flexibility and Agility Required
Technology moves at an unprecedented pace, which means you’ve got to be flexible. “Adaptability is key. CIOs who can’t adapt to that change will see themselves – unfortunately – left behind in a competitive job market. But the CIOs who see each new change – whether mobile, BYOD, Cloud, IoT – as an opportunity are the technology executives who will continue to be in demand – because they’ve proven that they can leverage new solutions to drive business value,” says J.M. Auron, IT executive resume writer and president of Quantum Tech Resumes.
Learn About the Business

“Having the business knowledge is a key foundational element to one’s career,” says GE Capital’s Blackie. Being a great developer isn’t enough if you plan to climb the corporate ladder. You’ve got to understand your industry and how your company does business. This kind of knowledge can also make you a better programmer: understanding the business needs helps you deliver products, software and services that better align with the business.

Always Be Learning
The price of career longevity in the world of IT and technology is constant learning. If you aren’t passionate about it or you’re complacent, it’s easy to find yourself locked into outdated technology and left behind. There are many ways to stay current, such as formal college courses or certification programs. “It is your career and it is up to you to keep educating yourself,” says Robert P. Hewes, Ph.D., senior partner with Camden Consulting Group, with oversight for leadership development and management training.

Professional organizations, conferences, developer boot camps and meet-ups are all great ways to stay abreast of the newest technologies and build network connections within your industry. “It’s often a place where you develop life-long friends and colleagues,” says Blackie.

Attend Industry Conferences

Industry conferences are a great way to learn about the newest trends in technology and to network with like-minded people. Be selective about which conferences you attend and make sure you allot the necessary time to socialize and network with your peers.

“One mistake attendees often make at conferences is filling their schedule so tightly with panels that they miss out on the networking available during downtime. It’s important to attend mixers and informal gatherings at conferences to meet your peers and build relationships that could last throughout your career,” says Blackie.

Incorporate Time into Your Day for Reading
Set aside a little time each day to stay current with the goings-on in your part of technology and beyond. “Become a regular reader of info in your industry, be it an industry journal or an online blog/magazine. There is a lot of information out there. Another quick way to find relevant information is via an aggregator; Pocket and LinkedIn do this,” says Hewes.

Google News and a host of other news aggregators like LinkedIn Pulse or Reddit offer a daily stream of news, with alerts and notifications that allow users to focus on key areas of interest.

Pay Attention to Competitors
“It’s important to get to know industry competitors and watch what they’re doing. You can learn a lot from the successes and failures of your competitors,” says Blackie. Being first isn’t always required to be successful; doing it better than the next guy is. Find your competitors, as well as organizations you consider thought leaders in your industry, and follow them on social media or create a Google Alert for them.

Find a Mentor or Coach

Mentoring is useful at all levels of one’s career. A mentor can help you negotiate internal politics or provide insight into how to solve lingering problems. You may also have different mentors throughout your career, each offering a different perspective or expertise.

Understand the Value of Social Media
Not everyone adores social media, but it’s a necessary element in the race to separate you from the rest of IT professionals. Build and maintain profiles on relevant social media sites and then use them to explain the value proposition you offer.

Work on Soft Skills and Some Not-so-Soft Ones

Branding Skills

Branding is what helps separate you from the rest of the pack and explains your value proposition to your employer or prospective employers. “Branding is another key for career advancement – and one that few technology leaders have fully embraced. Giving thought to that brand is key for career longevity and advancement,” Auron says.

Communication Skills

According to Auron, the ability to find the right path, communicate value and build enthusiasm is a crucial step in transforming the perception of IT from that of a cost center to that of a business enabler. “The most critical skill is the ability to communicate the real value of technology investment to nontechnical leadership. Some technologists can fall into one of two traps: giving so much detail that the audience’s eyes glaze over, or appearing patronizing when intelligent – but nontechnical – leaders don’t get a specific reference,” Auron says.

Project Management Skills

At some point in your technology career you will be asked to lead a project. When the time comes make sure you’ve got the necessary tools. “It is critical if you are headed onto the management track. In fact, you should try to gain wide experience with all kinds of projects,” says Hewes.


 


8 headline-making POS data breaches

Written by admin
October 14th, 2014

The rash of data breaches in the U.S. through POS terminals has many looking at the chip-and-PIN model used in Europe.

POS swipes
Next month marks an unceremonious anniversary of the 1960s-vintage “swipe-and-signature” magnetic stripe card system. To some, the end of this point-of-sale technology cannot come a moment too soon. While talks continue on implementing the “chip and PIN” system, we look back at the most recent incidents in which information was stolen from these devices.

Home Depot
Data thieves were the ones following The Home Depot’s credo of “More doing,” as they did more doing with other people’s money. The Home Depot reported a data breach earlier in September. The home improvement store confirmed that 2,200 stores were compromised in the U.S. and Canada. The number of credit cards affected may have reached 56 million.

Target
Around 70 million customers found coal in their stockings when Target was targeted by thieves just before this past Christmas.

Michaels
The arts and crafts chain reported more than 3 million credit cards affected by identity theft. It was a big one-two punch, with Target’s breach occurring just a month earlier.

Beef O’Brady
Customers had a beef with Beef O’Brady when, in early September, the Florida chain’s point-of-sale system was hit by a data breach. There are no reports yet on how many cards were affected.

Dairy Queen
Dairy Queen was hit with a blizzard of credit card numbers stolen at the end of August. According to Privacyrights.org, Dairy Queen reported a data breach of its POS (point-of-sale) system when malware that authorities are calling “Backoff” was found on the system. The restaurant chain is currently unclear as to how many stores were affected.

P.F. Chang’s
CSO’s Dave Lewis reported: On June 10, 2014, the staff at P.F. Chang’s received a visit they didn’t want to have come knocking on their door. The U.S. Secret Service came calling to alert the restaurant chain that it had been compromised by a criminal gang that was stealing data from its point-of-sale systems.

Shaw’s and Star Market
Shaw’s and Star Market operate more than 50 grocery stores in Massachusetts, but there has been no report on how many credit cards were stolen in their breach.

TJ Maxx
At over 45 million cards, the 2007 data breach at TJ Maxx is the granddaddy of them all. According to Computerworld: In filings with the U.S. Securities and Exchange Commission yesterday, the company said 45.6 million credit and debit card numbers were stolen from one of its systems over a period of more than 18 months by an unknown number of intruders.



Rise of smart machines, ubiquitous access and software-defined architectures will reshape IT, Gartner says

ORLANDO—Gartner defines its Strategic Technology Trends as those technologies that have the most potential to drive great change in the enterprise IT arena in the next three years.

Indeed this year’s crop has that potential as trends like software-defined networks and 3D printing take center stage in Gartner’s list.

“You need to be looking at linking to customers in new and unique ways; what technologies set the foundation to enable these moves,” said Gartner vice president David Cearley. IT will be dealing with everything from virtual technologies to intelligent machines and analytics data everywhere, he said. “And in the end all things run through a completely secure environment.”

So Gartner’s Top 10 Strategic Technology Trends for 2015 list looks like this:
1. Computing everywhere: Cearley says the trend is not just about applications but about wearable systems, intelligent screens on walls and the like. Microsoft, Google and Apple will fight over multiple aspects of this technology. You will see more and more sensors that will generate even more data, and IT will have to know how to exploit this—think new ways to track users and their interactions with your company—in an effective, positive way.

2. The Internet of things: Yes, this one seems to be getting old, but there’s more to it than the hype. Here IT will have to manage all of these devices and develop effective business models to take advantage of them. Cearley said IT needs to get new projects going and to embrace the “maker culture” so people in their organizations can come up with new solutions to problems.

3. 3D Printing: Another item that has been on the Gartner list for a couple of years. But things are changing rapidly in this environment. Cearley says 3D printing has hit a tipping point in terms of the materials that can be used and the price points of machines. It enables cost reduction in many cases. IT needs to look at 3D printing and think about how it can make the company more agile. Can 3D printing drive innovation?

4. Advanced, Pervasive and Invisible Analytics: Security analytics are the heart of next-generation security models. Cearley said IT needs to look at building data reservoirs that tie together multiple repositories, which can let IT see all manner of new information – such as data usage patterns and what he called “meaningful anomalies” – that it can act on quickly.

5. Context-Rich Systems: This one has been a Gartner favorite for a long time – and with good reason. The use of systems that utilize “situational and environmental information about people, places and things” in order to provide a service is definitely on the rise. IT needs to look at creating ever more intelligent user interfaces linking lots of different apps and data.

6. Smart Machines: This one is happening rapidly. Cearley pointed to IBM’s Watson, which is “learning” to fight cancer, and a mining company – Rio Tinto – which is using automated trucks in its mines. Virtual sages, digital assistants and other special-service software agents will abound in this world, he said.

7. Cloud/Client Computing: This trend was on last year’s list as well but Gartner says the need to develop native apps in the cloud versus migrating existing apps is the current issue.

8. Software-Defined Applications and Infrastructure: To get the agility new environments demand, we cannot have hard-coded, predefined networks, Cearley said. IT needs to be able to construct dynamic relationships, and software-defined technologies help on that scale.

9. Web-Scale IT: This trend remains pretty much the same as last year. Gartner says Web-scale IT is a pattern of global-class computing technologies that deliver the capabilities of large cloud service providers. The likes of Amazon, Google and others are re-inventing the way IT services can be delivered. It still requires a cultural shift in IT to be successful.

10. Risk-Based Security and Self-Protection: Cearley said all roads to success in the digital future lead through security. Trends here include building applications that are self-protecting.



From electronic pills to digital tattoos, these eight innovations aim to secure systems and identities without us having to remember a password ever again

8 cutting-edge technologies aimed at eliminating passwords
In the beginning was the password, and we lived with it as best we could. Now, the rise of cyber crime and the proliferation of systems and services requiring authentication have us coming up with yet another not-so-easy-to-remember phrase on a near daily basis. And is any of it making those systems and services truly secure?

One day, passwords will be a thing of the past, and a slew of technologies are being posited as possibilities for a post-password world. Some are upon us, some are on the threshold of usefulness, and some are likely little more than a wild idea, but within each of them is some hint of how we’ve barely scratched the surface of what’s possible with security and identity technology.

The smartphone
The idea: Use your smartphone to log into websites and supply credentials via NFC or SMS.

Examples: Google’s NFC-based tap-to-unlock concept employs this. Instead of typing passwords, PCs authenticate against the user’s phone via NFC.

The good: It should be as easy as it sounds. No interaction from the user is needed, except any PIN they might use to secure the phone itself.

The bad: Getting websites to play along is the hard part, since password-based logins have to be scrapped entirely for the system to be as secure as it can be. Existing credentialing systems (e.g., Facebook or Google login) could be used as a bridge: Log in with one of those services on your phone, then use the service itself to log into the site.

The smartphone, continued
The idea: Use your smartphone, in conjunction with third-party software, to log into websites or even your PC.

Examples: Ping Identity. When a user wants to log in somewhere, a one-time token is sent to their smartphone; all they need to do is tap or swipe the token to authenticate.

The good: Insanely simple in practice, and it can be combined with other smartphone-centric methods (a PIN, for instance) for added security.

The bad: Having enterprises adopt such schemes may be tough if they’re offered only as third-party products. Apple could offer such a service on iPhones if it cared enough about enterprise use; Microsoft might if its smartphone offerings had any traction. Any other takers?

Biometrics
The idea: Use a fingerprint or an iris scan — or even a scan of the vein patterns in your hand — to authenticate.

Examples: They’re all but legion. Fingerprint readers are ubiquitous on business-class notebooks, and while iris scanners are less common, they’re enjoying broader deployment than they used to.

The good: Fingerprint recognition technology is widely available, cheap, well-understood, and easy for nontechnical users.

The bad: Despite all its advantages, fingerprint reading hasn’t done much to displace the use of passwords in places apart from where it’s mandated. Iris scanners aren’t foolproof, either. And privacy worries abound, something not likely to be abated once fingerprint readers become ubiquitous on phones.

The biometric smartphone
The idea: Use your smartphone, in conjunction with built-in biometric sensors, to perform authentication.

Examples: The Samsung Galaxy S5 and HTC One Max (pictured) both sport fingerprint sensors, as do models of the iPhone from the 5S onwards.

The good: Multiple boons in one: smartphones and fingerprint readers are both ubiquitous and easy to leverage, and they require no end user training to be useful, save for registering one’s fingerprint.

The bad: It’s not as hard as it might seem to hack a fingerprint scanner (although it isn’t trivial). Worst of all, once a fingerprint is stolen, it’s, um, pretty hard to change it.

The digital tattoo
The idea: A flexible electronic device worn directly on the skin, like a fake tattoo, and used to perform authentication via NFC.

Examples: Motorola has released such a thing for the Moto X (pictured), at a cost of $10 for a pack of 10 tattoo stickers, with each sticker lasting around five days.

The good: In theory, it sounds great. Nothing to type, nothing to touch, (almost) nothing to carry around. The person is the password.

The bad: So far it’s a relatively costly technology ($1 a week), and it’s a toss-up as to whether people will trade typing passwords for slapping a wafer of plastic somewhere on their bodies. I don’t know about you, but even a Band-Aid starts bothering me after a few hours.

The password pill
The idea: This authentication technology involves ingesting something into your body — an electronic “pill” that can send a signal of a few bits through the skin.

Examples: Motorola demonstrated such a pill last year, one produced by Proteus Digital Health normally used for gathering biometrics for patient care (pictured).

The good: A digital pill makes the authentication process completely passive, save for any additional manual authentication (e.g., a PIN) that might be used.

The bad: Who is comfortable (yet) with gulping down a piece of digital technology? Like the digital tattoo, this doesn’t sound like something one would want to use regularly, but rather more as a day pass or temporary form of ID.

Voice printing
The idea: Use voice recognition to authenticate, by speaking aloud a passphrase or a text generated by the system with which you’re trying to authenticate.

Examples: Porticus, a startup profiled back in 2007, has an implementation of this technology (“VoiceKeyID”), available for multiple mobile and embedded platforms.

The good: The phrase used to identify you isn’t the important part; it’s the voice itself. Plus, it can be easily changed; speaking is often faster than typing or performing some other recognition; and it’s a solution that even works in a hands-free environment. Plus, microphones are now standard-issue hardware.

The bad: As with any technology that exists in a proprietary, third-party implementation, the hard part is getting people to pick up on it.

Brainwave authentication
The idea: Think your password and you’re logged in. That’s right: an authentication system that uses nothing but brainwaves.

Examples: A prototype version of the system, using a Bluetooth headset that contained an EEG sensor, has been demonstrated by folks at the University of California Berkeley School of Information. The “pass-thoughts” they used consisted of thinking about some easily memorized behavior, e.g., moving a finger up and down.

The good: Consumer-grade EEG hardware is cheap, and the tests conducted by the School of Information showed it was possible to detect a thought-out password with a high degree of accuracy.

The bad: Donning a headset to log in seems cumbersome — that is, assuming you’re not spooked by the idea of a computer reading your thoughts.


9 Rules for the Developer-ization of IT

Written by admin
September 15th, 2014

CIOs who want to drive innovation and change in their organizations should focus on making the lives of developers easier so they can innovate, produce great apps and deliver valuable IP.

The acceptance of SaaS, the cloud and other easily accessible technologies allows the lines of business to drive innovation without necessarily turning to IT for help.

While the CIO can take back some of that ground by becoming a broker and orchestrator of services, the real key to driving innovation and change today is app development, according to Jim Franklin, CEO of SendGrid, a provider of email infrastructure as a service.

Franklin says that much like the consumerization of IT that has been underway for several years, CIOs now need to embrace the “developer-ization of IT,” which is about allowing developers to focus on innovating, producing great apps and delivering valuable IP.

Rule 1: Embrace the Public Cloud
The public cloud offers developers access to scalable, flexible infrastructure. With it, they can scale up as necessary while consuming only what they need. The efficiencies created by the scale at which public clouds operate are something you just can’t replicate on your own.

“There’s no need to reinvent the wheel by building servers, storage and services on your own,” Franklin says. “This will shave precious time off of project schedules, reduce time to market and lower costs significantly.”

Rule 2: Adopt Enterprise Developer Marketplaces
Access to marketplaces full of enterprise-ready tools and APIs will allow your developers to build better applications faster.

“Embrace the new breed of enterprise developer marketplaces,” Franklin says. “Give developers access to more tools that are enterprise-ready. An emerging set of marketplaces from Windows Azure, Heroku and Red Hat provide a variety of new tools and services to ramp up application development productivity.”

Rule 3: Forget Long-Term Contracts for Tools and Services
A long-term contract for a service or tool may make financial sense, but can be a barrier to developer efficiency and agility. Instead, make it as easy as possible for developers to self-select the best tool for the job at hand.

“The nature of application development can be very transitory at times,” Franklin says. “Developers may need one service or tool one day and then pivot on to something else the next, and they like to try and test tools before they make a financial commitment. Make the process of using different tools and vendors frictionless for them so they can self-select the tools they want. Long-term contracts impede this since approvals are needed from procurement or legal, and this can draw out the process.”

Rule 4: Recognize that Developers Speak Their Own Language
When trying to communicate with developers — whether you’re trying to attract talent, project manage or target them for sales — tailor your messages for a highly technical audience and use the communication channels they’re comfortable with. This could mean user forums, hackathons or social media.

“The key is trying to be very flexible and open,” Franklin says. “Hackathons have really become a strong trend because of the power of letting people show their creativity in a lot of different ways.”

Rule 5: Give Developers Freedom With Controls
Creative solutions require the ability to experiment freely. Embrace that, but also put some controls in place for your own peace of mind. Franklin suggests deploying API management solutions and monitoring tools so IT can have a window into the traffic flowing through the network and can ensure security measures are taken into account.
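To make the idea concrete, here is a minimal sketch, assuming a Python WSGI stack, of the kind of monitoring hook that gives IT a window into API traffic. A real API management gateway adds authentication, quotas and analytics; this only illustrates the logging layer, and the class and logger names are made up for the example.

```python
# Toy sketch of a "window into the traffic": a WSGI middleware that logs
# every request passing through the wrapped application.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("api-traffic")

class TrafficLogger:
    def __init__(self, app):
        self.app = app                          # the wrapped WSGI application

    def __call__(self, environ, start_response):
        start = time.time()

        def logging_start_response(status, headers, exc_info=None):
            # Record method, path, status and latency for each call.
            log.info("%s %s -> %s (%.1f ms)",
                     environ.get("REQUEST_METHOD"), environ.get("PATH_INFO"),
                     status, (time.time() - start) * 1000)
            return start_response(status, headers, exc_info)

        return self.app(environ, logging_start_response)
```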

Rule 6: Don’t Get Locked Down by a Platform or Language
Encourage your developers to build apps that are platform agnostic across Web, mobile and for Internet of Things devices from the start. Designing apps to be platform agnostic in the first place can save developers a lot of grief in the long-term.

“Rather than building for the Web and then adding a mobile extension later, developers should keep this in mind at the start of the app development process,” Franklin says. “If an application has a physical aspect to it, developers should be encouraged to define, deploy, communicate and manage the IoT application in a scalable fashion from the start.”

Rule 7: Give Developers Space to Pursue Their Own Projects
Developers are creative people with a natural love for making things. They’ll be happiest if you provide them with a collaborative, creative outlet for their own projects, and they may just solve an intractable problem in the process.

“While the Google 20 percent idea — where employees used to take one day a week to work on side projects — may not work for everyone, understand that developers have an inherent desire to share new tools, hacks, shortcuts and passion projects with their peers,” Franklin says. “Give them time to do this at work. Some of these ideas may end up in your products. Gmail, AdSense and Google Hangouts, for example, all started as side projects of Google employees.”

Rule 8: Set Standards for Coding in RESTful, Modern APIs
Issue a set of best practices related to usable standards like REST. This, Franklin says, will allow developers to more rapidly build applications that access and act upon data exposed via APIs, even in environments with unreliable network speeds and limited computing power.

“REST also makes it easy for humans to understand what’s being exchanged while allowing computers to talk to one another efficiently,” Franklin adds.
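For illustration, here is a minimal sketch of the kind of REST interaction being described: a client fetches JSON over HTTP and acts on it. The endpoint, path and fields (api.example.com, /v1/orders, id, status) are hypothetical placeholders, not anything from SendGrid or Franklin.

```python
# Minimal REST client sketch: GET a resource, check the status, parse JSON.
import requests

def fetch_recent_orders(base_url="https://api.example.com/v1"):
    # A GET on a noun-like URL returns a self-describing JSON document,
    # which keeps the exchange readable for humans and machines alike.
    resp = requests.get(f"{base_url}/orders", params={"limit": 10}, timeout=5)
    resp.raise_for_status()          # fail loudly on 4xx/5xx
    return resp.json()               # parsed list of order objects

if __name__ == "__main__":
    for order in fetch_recent_orders():
        print(order.get("id"), order.get("status"))
```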


 


Guys, stop creeping out women at tech events

Written by admin
September 10th, 2014

Not realizing that your behaviors constitute harassment is no excuse

I go to a lot of security conferences, but I never gave much thought to this curious fact: The conferences are hardly ever headlined by women. In fact, not a lot of women attend security conferences and other tech events.

I guess, if I noticed this at all, I chalked it up to the general dearth of women in the technology field. But the scarcity of women at events goes beyond that, and my eyes have only recently been opened to this fact and the reality that explains it.

My awakening began through my role as president of the Information Systems Security Association. I’ve been leading the creation of special interest groups (SIG) with the goal of making ISSA a more virtual organization that could bring together people with common technical interests from around the world. I was somewhat surprised that common technical interests were not driving the most energetic SIG, Women in Security.

I was curious. Why were women so interested in banding together? As I started asking questions, it began to make sense. And I found in all this a message for men.

IT guys, women are uncomfortable around us. Enough of us are acting like creeps around them that they would rather not join us in large groups. Even in a virtual setting like SIGs, they would rather get together with other women.

My questioning revealed to me that women are harassed on a regular basis at professional events. The harassment is often minor, but there have been cases of physical assault and even rape. Part of the problem is that to too many men in IT, a lot of the minor harassment incidents at tech events — men putting their arms around women’s shoulders, men hitting on women, men telling women graphic details about their sexual exploits — sound like no big deal.

Men who feel that way are not adept at empathy. They do not see that sexual overtures made to a woman are threatening in a way that sexual overtures made to a man rarely are. They do not imagine that a woman might need to maintain a personal physical space in order to feel safe. They might not even understand that hitting a woman on the buttocks is physical assault.

What everyone needs to understand is that even minor harassment is a significant reason for women to avoid tech events, networking opportunities and conferences. Maybe you would be flattered to be propositioned by a woman at an event. That doesn’t mean that a woman is going to feel the same way. Most women don’t like it. Especially in a professional setting. And even more so when she is one of very few women in a room full of men who are leering at her as if she were a zebra walking through a pride of lions.

Guys, you know what it means to be professional, don’t you? And you know that a professional conference is not the same as a pickup bar or a frat mixer, right? All right, then, if you tell a woman that she gave a great presentation, don’t follow it up by hitting on her. That pushes professionalism right out the window. So does invading a circle of people who are networking at an event and putting your arm around the lone woman.

These are all things that have actually happened, and in every case, the women involved told me, they did nothing to encourage the harassing behavior. So what gave those guys the idea that they had license to do these things? Ignorance, and the fact that many men within the tech industry are socially awkward. This is not offered as an excuse, but a lot of techies just don’t know when they are making a woman uncomfortable. I’m sure that I have unintentionally offended someone at some point in my career. But techies are also very smart, and knowing that this problem exists, we are capable of changing our behaviors so that women don’t keep paying the price of our social awkwardness.

That said, though, some men in the tech industry intentionally practice sexually aggressive behaviors. That is completely unacceptable, and it is why I support the Ada Initiative in its efforts to set up a code of conduct for events. (I’m not comfortable with reports that the Ada Initiative may have been involved in censorship at a B-Sides security conference, but that is another matter.) Event organizers and other men in general need to take a stance to filter out these people proactively. Likewise, the women have to speak up and let the event organizers know who the serial harassers are.

One final observation: If harassment itself isn’t disturbing enough, many women blame themselves when they are its victims. They feel that they should have been strong enough to confront the harasser and tell him to stop, but failed to do so because they found themselves surrounded by strangers and caught off guard. With all of those eyes on them, they didn’t want to seem like troublemakers. And they have seen, time and time again, women who complain about harassment being called uptight bitches who are just imagining things or making them up. So they just swallow their pride and keep quiet. But later, they blame themselves for not standing up to the perpetrator.

After talking to many women in the tech profession in recent weeks, I have learned that a lot of them have decided that networking and participating at tech events is not worth the grief that they have to put up with. Clearly, the tech profession has a problem if women feel forced to make this choice. And, guys, it would be morally reprehensible if we didn’t do all we can to rectify that situation.

 



Automation technology is getting better as help desk requests continue to rise

Competing forces are affecting people who work on help or service desks. One is improving automation tools, which advocates say can replace level 1 and 2 support staff. At the same time, the number of help desk tickets is rising each year, which puts more demand on the service desk.

These cross-currents in the industry make it hard to predict the fate of some IT jobs. A Pew survey, released in August, of nearly 1,900 experts found a clear split on what the future may bring: 52% said tech advances will not displace more jobs than they create by 2025, but 48% said they will.

Either way, a push toward automation is certain. In the help desk industry, the goal is to keep as many calls for help as possible at either Level 0, which is self-help, or Level 1. It’s called “shift-left” in the industry.

“It costs way more to have a Level 3 or Level 2 person resolve an issue, and it also takes a lot more time,” said Roy Atkinson, an analyst at HDI, formerly known as the Help Desk Institute. To keep costs down, help desks are increasingly turning to automation and improvements in technologies such as natural language processing, he said.

A Level 1 worker will take an initial call, suggest a couple of fixes, and then — lacking the skill or authority to do much more — escalate the issue. The Level 2 worker can do field repair work and may have specific application knowledge. A Level 3 escalation might involve working directly with application developers, while Level 4 means taking the problem outside to a vendor.
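As a rough illustration (not from the article), the tier structure just described can be expressed as a simple escalation table; the descriptions paraphrase the paragraph above.

```python
# Sketch of the support-tier model: Level 0 is self-help, higher levels cost more.
SUPPORT_TIERS = {
    0: "Self-help: knowledge base, self-service portal",
    1: "Initial call handling: scripted fixes, then escalate",
    2: "Field repair and application-specific knowledge",
    3: "Work directly with application developers",
    4: "Take the problem outside to a vendor",
}

def escalate(current_level: int) -> int:
    """Move a ticket one tier up; 'shift-left' tries to keep this from happening."""
    return min(current_level + 1, max(SUPPORT_TIERS))

level = 1                                           # ticket enters at Level 1
print(f"Handled at Level {level}: {SUPPORT_TIERS[level]}")
level = escalate(level)                             # scripted fixes did not resolve it
print(f"Escalated to Level {level}: {SUPPORT_TIERS[level]}")
```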

Among the companies developing automation tools is New York-based IPsoft, a 15-year old firm with more than 2,000 employees. It develops software robotic technology and couples it with management services.

A majority of IT infrastructure will eventually be “managed by expert systems, not by human beings,” said Frank Lansink, the firm’s CEO for the European Union. IPsoft says its technology can now eliminate 60% of infrastructure labor tasks.

IPsoft’s autonomic tools might discover, for instance, a network switch that isn’t functioning, or a wireless access point that is down. The system creates tickets and then deploys an expert system, a software robot with the programming to make the repair. If it can’t be done, a human intervenes.
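The general flow described here (detect a fault, open a ticket, attempt a scripted repair, escalate to a person if it fails) can be sketched as follows. This is not IPsoft's implementation; every function name is a hypothetical stand-in for whatever monitoring and ITSM tools are in place.

```python
# Generic auto-remediation loop: detect -> ticket -> scripted fix -> escalate.
import time

def remediation_loop(poll_devices, open_ticket, run_remediation, notify_engineer):
    while True:
        for device in poll_devices():              # e.g. switches, access points
            if device["status"] == "down":
                ticket = open_ticket(device)       # create the incident record
                fixed = run_remediation(device)    # scripted restart/repair attempt
                if not fixed:
                    notify_engineer(ticket)        # hand off to a human (Level 2/3)
        time.sleep(60)                             # poll interval is arbitrary here
```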

Many service desk jobs have been moved offshore over the last decade, displacing workers. That trend is ongoing. One of the ideas underlying IPsoft’s business model is a belief that offshore, as well as onshore, labor costs can be further reduced through automation.

Offshore firms are clearly interested. IPsoft’s platform was adopted last year by Infosys and, more recently, by Accenture.

One IT manager using IPsoft’s automation technology and services to support his firm’s infrastructure — including its network, servers and laptops — is Marcel Chiriac, the CIO of Rompetrol Group, a Romania-based oil industry firm with 7,000 employees serving Europe and Asia.

“Without the automation, we would have to pay a lot more” for IT support, said Chiriac.

The cost savings arise from automatic repairs and routine maintenance that might otherwise be neglected, said Chiriac.

If he weren’t using autonomic tools, Chiriac said he would have to hire more people for a similar level of service. But he can’t easily estimate the impact on staff because of the firm’s IT history. (Rompetrol Group outsourced its 140 IT staff, ended that relationship, then rebuilt an internal IT staff with about two dozen fewer workers; it also uses outsourcing as a supplement.)

Nonetheless, Chiriac doesn’t believe that infrastructure automation will necessarily eliminate IT jobs, though it may shift them to other IT areas. “In IT, we’re not going to run out of work for the next two generations,” said Chiriac.

The work that help or service desks are asked to take on is increasing. Two-thirds of the 1,200 organizations surveyed by HDI reported that the number of tickets (whether to fix something broken or, for instance, to outfit a new hire or change permissions) is increasing annually by more than 60%.

The top five reasons for this increase, according to HDI’s survey, are an increase in the number of customers at surveyed firms, a rising number of applications, changes in infrastructure, increases in the scope of services, and the need to support different types of equipment and more devices. The latter could reflect BYOD use.

At the same time, support is being transformed in new ways. Service desks may, for instance, now act as a liaison for all service providers, including cloud and mobile carriers, said Atkinson.

“I think a lot of people have been predicting the death of support for a number of years, and it hasn’t happened,” said Atkinson.



Google is drawing from the work of the open-source community to offer its cloud customers a service to better manage their clusters of virtual servers.

On Monday, the Google Cloud Platform started offering the commercial version of the open-source Mesos cluster management software from Mesosphere.

With the Mesosphere software, “You can create a truly multitenant cluster, and that drives up utilization and simplifies operations,” said Florian Leibert, co-founder and CEO of Mesosphere. Leibert was also the engineering lead at Twitter who introduced Mesos to the social media company.

First developed at the University of California, Berkeley, Mesos can be thought of as an operating system that allows an administrator to control an entire cluster of computers, or even an entire data center, as if it were a single machine.

Thanks to its fine-tuned scheduling capabilities, Mesos can allow multiple frameworks, such as Hadoop or Spark, to share a single cluster, as well as allow multiple copies of the same framework to run on a single cluster.

The software also has built-in resiliency: If one or several nodes stop working, the software can automatically move that work to other, operational nodes in that cluster.

Twitter, Airbnb, Netflix and Hubspot have all used Mesos to coordinate operations.

Google has modified its new software for managing Docker containers, called Kubernetes, so it can run on Mesos, a move Google also announced Monday.

Google has been an ardent user of containers internally, running more than 2 billion containers a week in its routine operations. The open-source Docker provides container-based virtualization, an alternative to traditional virtualization that many organizations are now considering because of its putative performance advantages.

Now, Google customers can use a Mesosphere cluster to run Docker containers and use any leftover capacity to run other framework-based workloads.

“You’ll be able to create these modern distributed systems the way that Google does, and you’ll be able to run them side-by-side with all your existing applications,” said Craig McLuckie, Google Cloud Platform product manager.

Users can also move their workloads to any cloud provider that runs Mesos, eliminating the dependencies that can come with writing the applications to run on a specific cloud service, be that Google’s or some other vendor’s.

Google’s Mesosphere cluster package also includes the Apache ZooKeeper configuration software and the Marathon scheduling software, as well as OpenVPN for logging into the cluster.
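As a rough illustration of how a workload reaches such a cluster, here is a minimal sketch of submitting a Docker-based app to Marathon's REST API. The Marathon URL and the app fields shown are illustrative assumptions; consult the Marathon documentation for the authoritative schema.

```python
# Sketch: launch a Docker container on a Mesos cluster via Marathon's REST API.
import requests

app_definition = {
    "id": "/demo/nginx",            # hypothetical application id
    "cpus": 0.25,
    "mem": 128,
    "instances": 2,
    "container": {
        "type": "DOCKER",
        "docker": {"image": "nginx:latest"},
    },
}

resp = requests.post("http://marathon.example.com:8080/v2/apps",
                     json=app_definition, timeout=10)
resp.raise_for_status()
print("Deployment accepted:", resp.json().get("id"))
```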

Use of Mesosphere on the Google Cloud Platform is not billed separately; it is included in the price of running a cluster.



7 cool uses of beacons you may not expect

Written by admin
August 13th, 2014

Beacons are a useful new twist on location awareness, tying a unique ID to a small device that apps can use to discover not just location but other relevant context about the beacon’s location or what it is attached to.

Most beacon trials involve spam-like use by retailers, such as shoppers getting pitches for specific brands of cereal while walking down the cereal aisle, but there are better uses of beacons being piloted. These examples come from Onyx Beacons and StickNFind, two beacon vendors that work with a wide range of industries, reflecting actual pilot projects.
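To make the "unique ID plus context" idea concrete, here is a minimal sketch of parsing the common iBeacon advertisement layout (a 16-byte UUID, a 2-byte major, a 2-byte minor and a signed TX power byte). The raw bytes would come from whatever BLE scanning library the platform provides, and the sample UUID and major/minor values are arbitrary.

```python
# Sketch: decode an iBeacon manufacturer-data frame into its identifiers.
import struct
import uuid

def parse_ibeacon(mfg_data: bytes):
    # Apple manufacturer data: company ID 0x004C, type 0x02, length 0x15,
    # then proximity UUID, major, minor, measured TX power at 1 m.
    if len(mfg_data) < 25 or mfg_data[0:4] != b"\x4c\x00\x02\x15":
        return None                                   # not an iBeacon frame
    beacon_uuid = uuid.UUID(bytes=mfg_data[4:20])
    major, minor = struct.unpack(">HH", mfg_data[20:24])
    tx_power = struct.unpack("b", mfg_data[24:25])[0]
    return {"uuid": beacon_uuid, "major": major, "minor": minor, "tx_power": tx_power}

# Example frame (arbitrary UUID); major could identify a store, minor an aisle.
frame = bytes.fromhex("4c000215"
                      "f7826da64fa24e988024bc5b71e0893e"
                      "0001" "002a" "c5")
print(parse_ibeacon(frame))
```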

Identify passengers stuck in airport security
A consortium of airlines and airports is testing placement of beacons at security lines in airports. Passengers’ airline apps would know those people are in the security line, and airlines could thus know if there are people at risk of missing their flight so they can send out reps to get them or hold the plane. The same notion could be used at airport gates to monitor people who risk missing connecting flights or whose gates have changed — even helping them get to their new gate faster.

Remember to take out the garbage
Imagine: You affix a beacon to your garbage can, and your task manager knows that Thursday is garbage pickup day. If you go past your garbage can on Thursday, a BLE auto-connection sends you a reminder to take it out. Combine that with GPS location detection from your phone, and your app will know if you did actually move the can to the curb and not remind you.

Track things smarter
All sorts of industries are looking at affixing beacons to pallets, carts and other movable equipment to track their location as they move about. Think airline cargo containers, hospitals’ computers-on-wheels, warehouse pallets, museum artwork, bulldozers at a construction site, contractors at a job site, even hospital patients, students or visitors (so they don’t get lost and their movement patterns can be discovered, such as for space planning).

Authorize access to cars, buildings, and more
We already have Bluetooth fobs to unlock our cars so we can drive them and Bluetooth locks that know it’s your iPhone at the door. Kaiser Permanente uses Bluetooth badges to authorize physician access to their individual accounts in shared computers in each exam room. The same can be done with tablets.

So it’s no surprise that companies are exploring the use of beacons and people’s own mobile devices as access systems in their buildings or to unlock and start your car rather than use proprietary radio readers or fobs.

Navigate buildings and other spaces
When you visit a customer, you often get lost when trying to find a conference room, bathroom, or kitchen. Beacons can be used both as virtual “where am I?” kiosks and as monitors of your movement, so an app can guide you to your destination — and alert the company if you wander where you shouldn’t.

Plus, you can get information about where you are, whether about the artist whose painting you are viewing in a museum or the instructions for the copier you’re trying to operate — even just the Wi-Fi password for the conference room’s public hotspot.

Check out ski conditions at the lift
One ski resort is placing beacons at ski lift entrances not just to track the number of people using each lift, but to let skiers check the conditions for the runs available at each lift before they get on the lift. If a resort’s app had information about your age or skiing skills, it could even suggest that you should try a different run and thus go to a different lift.

Provide smarter signage
Signs are great, but limited. They provide only the information they provide, and they are often limited to one or two languages. If signs were beacon-enabled, users could get translations in their languages and get more information than any sign could hold. You can imagine such uses in zoos, museums, and botanical gardens, but also amusement parks, airport lobbies, and hospitals.



Big tech firms back Wi-FAR for remote broadband

Written by admin
August 7th, 2014

802.22 standard, approved in 2011, promises low-cost broadband for remote areas

Google, Microsoft and Facebook are cranking up an emerging wireless technology known as Wi-FAR to help reduce the digital divide in remote and unconnected regions of the world.

Wi-FAR is a recently trademarked name from the nonprofit WhiteSpace Alliance (WSA) that refers to the 802.22 wireless standard first approved by the IEEE (Institute of Electrical and Electronics Engineers) in 2011.

The standard shares the underused TV band of spectrum called whitespace to send wireless signals, typically over distances of six to 18 miles in rural and remote areas. It has a theoretical download speed of more than 22 Mbps per TV channel that serves up to 512 devices, according to the WSA. That could result in speeds of about 1.5 Mbps on a downlink to a single device.

While such speeds are far slower than for the gigabit fiber-optic cable services that Google and AT&T are building in some U.S. cities, the speeds could theoretically begin to compete with some 3G cellular speeds, although not 4G LTE speeds. For an impoverished or sparsely populated region where businesses and schoolchildren have little Internet access, Wi-FAR could be a godsend when used to link base stations (typically found at the ground level of cell towers) in a distributed network.
[Photo caption: Students at the University of Limpopo in South Africa use laptops connected to the Internet via Wi-FAR wireless technology. Photo: Microsoft]

About 28 million people in the U.S. don’t have access to broadband, while globally about 5 billion people, nearly three-fourths of the world’s population, don’t have broadband Internet access, said Apurva Mody, chairman of both the WSA and of the 802.22 Working Group.

“This is cheap Internet access and there are dozens of trials underway, with Google in South Africa, Microsoft in Tanzania and other continents, and even Facebook’s interest,” Mody said in an interview. “You have 1.2 billion people in India who need cost-effective Internet access. There’s a lot of enthusiasm for Wi-FAR.”

Wi-FAR will be cheaper for Internet access than LTE and other wireless services. The lower cost is partly because Wi-FAR works over unlicensed spectrum, similar to Wi-Fi, which allows network providers, and even government entities, to avoid paying licensing fees or building as many expensive cell towers, which can cost $50,000 apiece, Mody said. “The prices for Wi-FAR service will be very small, perhaps less than $10 per month per household.”

The 802.22 technology can be low cost because the whitespace spectrum is shared with conventional users, including TV stations on UHF and VHF bands. Thanks to sophisticated databases that track when a whitespace channel will be in use in a particular region, a cognitive (or smart) radio device can determine when to switch to another channel that’s not in use. Testing in various Wi-FAR pilot projects, many of them in Africa, is designed to prove that Wi-FAR devices won’t interfere with other existing users on the same channel.
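Conceptually, the database lookup works something like the sketch below: before transmitting, the radio asks which TV channels are vacant at its location and picks one. The endpoint and response fields here are hypothetical placeholders, not the interface of any particular approved database provider.

```python
# Sketch: ask a whitespace database which TV channels are free at a location.
import requests

def pick_free_channel(lat, lon,
                      db_url="https://whitespace-db.example.com/v1/available"):
    resp = requests.get(db_url, params={"lat": lat, "lon": lon}, timeout=10)
    resp.raise_for_status()
    # Hypothetical response shape: [{"channel": 31, "max_eirp_dbm": 36}, ...]
    channels = resp.json()["channels"]
    if not channels:
        raise RuntimeError("no vacant TV channels at this location")
    # Prefer the channel that allows the highest transmit power.
    return max(channels, key=lambda c: c["max_eirp_dbm"])["channel"]
```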

“We have yet to have an interference problem,” said James Carlson, CEO of Carlson Wireless Technologies, a Sunnyvale, California-based company that is working with Google on two six-month trials of 802.22 in the UK, among other areas. The company completed a successful trial with Google serving students in South Africa in 2013. Carlson, in an email interview, said the company is working with five database providers, noting that the “prime purpose of the database is to protect the incumbent spectrum user.”

Whitespace spectrum sharing, coupled with the use of the databases, is generally called dynamic spectrum allocation technology. In January, the U.S. Federal Communications Commission approved Carlson’s RuralConnect TV whitespace radio system for use with a Spectrum Bridge TV whitespace database, effectively bringing the first dynamic spectrum sharing product to market.

In the U.S., RuralConnect is authorized for use in the UHF TV band, running from 470 MHz to 698 MHz. The FCC opened up the band in 2010.

At the time, Carlson said the FCC’s approval would give a boost to global efforts to use whitespace technology. “Providing connectivity to underserved populations worldwide is more than an interest to us,” he said in a statement. “It’s our corporate mission.”

RuralConnect will get competition from products from other companies, including Redline, Adaptrum and 6Harmonics, Carlson said. In addition to the other database providers, Google has built a whitespace database that Carlson is testing.

In all, Carlson Wireless has piloted dozens of whitespace projects, and expects to start its largest yet for 30 base stations and 5,000 users near New Delhi in the next six months, Carlson said.

“India is the next big boom for online needs, and the rural areas are not getting [Internet service] with [typical] mobile systems,” Carlson said. “So they are choosing to go with the TV whitespace because the UHF band is almost all vacant in rural areas and 600 MHz propagation is superb.”

While Carlson has been working with Google, Microsoft separately announced in June a whitespace pilot project at the University of Limpopo in South Africa. It is part of a Microsoft 4Afrika Initiative to help ignite economic development in Africa.

In May, Microsoft and Facebook joined with SpectraLink Wireless to announce a whitespace project for students and faculty at universities in Koforidua, Ghana. That project brought the number of nations where Microsoft has whitespace pilots to 10 countries on four continents.

In the Microsoft and SpectraLink partnership, Facebook’s Connectivity Lab team will lead efforts to better understand how TV whitespace spectrum can support wireless Internet users, according to a statement.

Microsoft and others believe that TV whitespace technology will best work in combination with Wi-Fi and other low-cost wireless technologies. While much of whitespace technology is focused on building specialized bridge hardware for use in base stations, Mody said some companies are developing fixed wireless 802.22 routers, similar in appearance to Wi-Fi routers, that will be placed inside of homes.

Microsoft also spearheaded the Dynamic Spectrum Alliance, which Google and Facebook joined last November. The alliance is exploring many uses for whitespace spectrum, including Internet of Things device connectivity.

Craig Mathias, an analyst and wireless consultant for The Farpoint Group, said 802.22 devices may compete against or complement a number of other technologies, including cellular and Wi-Fi.

“802.22 is not a pipe dream, but so far there’s not a lot of evidence of its success,” Mathias said in an interview. “It does make sense. The rate of innovation in wireless is so high that you hear something exciting every week. But not all wireless standards are successful in terms of having [successful] wireless products.”



New SSL server rules go into effect Nov. 1

Written by admin
July 26th, 2014

Rules designed to thwart man-in-the-middle attacks; could mean extra work for IT shops

Public certificate authorities (CAs) are warning that as of Nov. 1 they will reject requests for internal SSL server certificates that don’t conform to new internal domain naming and IP address conventions designed to safeguard networks.

The concern is that SSL server digital certificates issued by CAs at present for internal corporate e-mail servers, Web servers and databases are not unique and can potentially be used in man-in-the-middle attacks involving the setup of rogue servers inside the targeted network, say representatives for the Certification Authority/Browser Forum (CA/B Forum), the industry group that sets security and operational guidelines for digital certificates. Members include the overwhelming bulk of public CAs around the globe, plus browser makers such as Microsoft and Apple.

“Even in an internal network, it’s possible for an employee to stand up a fake server,” says Rick Andrews, senior technical director for trust services at Symantec, explaining the new rules.

The problem today is that network managers often give their servers names like “Server1” and allocate internal IP addresses so that SSL certificates issued for them through the public CAs are not necessarily globally unique, notes Chris Bailey, general manager for Deep Security for Web Apps at Trend Micro.

“People rely on these internal names today,” Bailey says. But “if someone hacks in, they can set up a man-in-the-middle domain.”

The CA/B Forum concluded three years ago that this was a significant security issue and nailed down new certificate-issuance guidelines, which member CAs have been sharing with their customers. Now that the Nov. 1 deadline is getting closer, they are speaking out about it.

As of Nov. 1, network managers requesting internal SSL certificates from the public CAs will have to follow the new guidelines. They will need to ensure that SSL server certificate requests are expressed in a way that associates them with an externally registered domain name, says Andrews. Some enterprises already use names that chain up to the company name, but “these are probably in the minority,” he adds.
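For a rough sense of what “globally unique” means in practice, here is a minimal Python sketch that flags the kinds of names the new rules reject: single-label host names, reserved internal suffixes and private IP addresses. The suffix list and checks are illustrative assumptions, not the CA/B Forum’s official test.

```python
# Minimal sketch: flag subjectAltName entries likely to be rejected under the
# new CA/B Forum rules. The suffix list and checks are illustrative only.
import ipaddress

NON_PUBLIC_SUFFIXES = {"local", "lan", "internal", "corp"}  # illustrative, not exhaustive

def is_publicly_acceptable(san: str) -> bool:
    """Return True if a subjectAltName looks like a globally unique name."""
    try:
        # Raw IP addresses are only acceptable if they are globally routable.
        return ipaddress.ip_address(san).is_global
    except ValueError:
        pass  # not an IP address, treat it as a DNS name

    labels = san.lower().rstrip(".").split(".")
    if len(labels) < 2:
        return False          # single-label names such as "Server1"
    if labels[-1] in NON_PUBLIC_SUFFIXES:
        return False          # e.g. mail.corp, intranet.local
    return True

if __name__ == "__main__":
    for name in ["Server1", "mail.example.com", "intranet.local", "10.0.0.12"]:
        verdict = "OK" if is_publicly_acceptable(name) else "would be rejected"
        print(f"{name}: {verdict}")
```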

MORE WORK FOR YOU?
This change to requirements pertaining to public issuance of internal SSL server certificates means that in some instances, network managers may need to expand their internal DNS infrastructure so the name maps appropriately, Andrews points out. For some, particularly large organizations with sprawling networks, it could be a painful set of changes, even impacting the applications running on these servers, he acknowledges.

For any organization or network manager not wishing to adhere to the new public CA issuance guidelines, there are a few alternatives, though Andrews says many may not find them appealing. Organizations can decide not to obtain publicly issued SSL certificates for internal servers and instead issue digital certificates themselves from their own private certificate hierarchy. But Web browsers will not necessarily trust these private certificates, and further changes may be needed to ensure they do.

One other CA/B Forum deadline to keep an eye on: Oct. 1, 2016. By then, any SSL certificates issued for internal domains that don’t meet the new standards will be revoked. Organizations that determine they must make changes to meet the CA/B Forum guidelines now have about two years to migrate.

 



New products of the week July 14, 2014

Written by admin
July 20th, 2014

Our roundup of intriguing new products. Read how to submit an entry to Network World’s products of the week slideshow.

Product name: APCON IntellaFlex 3288-XR Switch with WebXR
Key features: High availability, scalable network monitoring switch: 288 non-blocking 1G/10G ports, web-based management software, packet aggregation, filtering and manipulation, rate conversion, load balancing, 40G trunking, easy maintenance with hot swappable modules. More info.

Product name: Kaseya Traverse
Key features: Traverse is a breakthrough cloud and service-level monitoring solution that provides real-time visibility into the performance of virtualized IT services allowing proactive SLA management and rapid root-cause analysis. More info.

Product name: Niagara 4272
Key features: is a network packet broker that supports up to 72 ports of 1Gbps and/or 10Gbps, for aggregation, mirroring, filtering and packet slicing in a 1U. More info.

Product name: Identity Data Platform (Version 4.6)
Key features: A Customer Identity Management Solution delivering one, common customer profile with unlimited scalability and 5X the performance of legacy products, now featuring social log-in and advanced, multi-factor authentication. More info.

Product name: Anturis 2.0
Key features: Targeted to SMBs and web hosters, Anturis 2.0 provides enterprise-grade IT monitoring and troubleshooting — all in a simple and easy-to-use cloud-based solution. More info.

Product Name: ProtectFile for Linux
Key features: ProtectFile delivers transparent and seamless encryption of sensitive data stored in Apache Hadoop clusters. It presents limited drag on performance and doesn’t require the re-architecting of existing big data implementations. More info.

Product name: Marketing Cloud Management (MCM)
Key features: is a SaaS solution that helps lines of business leaders to collaborate and evaluate digital marketing technology investments including operational costs, site performance and data leakage. More info.

Product name: iboss 7.0
Key features: expands malware and APT protection by incorporating ‘lean forward’ technologies such as Behavioral Data Baselining to detect anomalies, Sandboxing, and increased detection and response capabilities for infected devices on the network. More info.

Product name: Chekkt.com
Key features: Chekkt is a new B2B hub that helps businesses discover and compare the best SaaS solutions and services based on crowd-sourced user ratings and reviews. More info.

Product name: iNetSec Smart Finder
Key features: is an all-in-one solution that combines full network visualization of all wired and wireless devices and applications used, with the addition of new IPS features. More info.

Product name: SignNow by Barracuda
Key features: The new SignNow Kiosk Mode for iPads can be used in waiting rooms, check-in tables or anywhere else that requires gathering multiple signatures on fresh copies of a document. More info.

Product name: Governance Portal for Third-Party Anti-Corruption
Key features: Software run in the cloud that allows clients to manage onboarding, response collection, automated watch-list lookup, false positive analysis and risk scorecard development for third-party business partners. More info.



 

 

Anyone who works with technology knows that at some point or another, all computers behave as if they are evil supercomputers. But in the annals of science fiction, a very specific sort of villain has gradually emerged — the sentient and usually hostile supercomputer that bedevils mankind with its inhuman motives and sinister agendas. Science fiction specializes in taking our cultural anxieties and flipping them into stories. As computers entered our lives, stories of out-of-control supercomputers soon followed. Here we take a look at some of the more infamous evil supercomputers in the history of science-fiction books, movies, television, and video games.

HAL 9000
In terms of pure pop culture Q score, no malicious computer is more famous than HAL 9000, the disturbingly calm antagonist of Stanley Kubrick’s classic “2001: A Space Odyssey.” In the film, based on Arthur C. Clarke’s story, HAL is the ship computer that controls the Discovery One spacecraft en route to Jupiter. HAL famously flips out and attempts to kill the mission astronauts. Pilot Dave Bowman survives and unplugs HAL, one electronic synapse at a time, in the film’s most chilling scene. Interestingly, Kubrick frames HAL’s decision as an ethical quandary for the machine — the AI is conflicted about its mission directives, and decides killing the crew is actually the right thing to do.

Mother
Clearly inspired by HAL 9000, Mother is the ship computer in Ridley Scott’s 1979 sci-fi suspense masterpiece “Alien.” Like HAL, Mother also has a secret agenda that she keeps from the crew, although she shares it with fellow artificial intelligence Ash — the shifty android aboard the interstellar mining vessel Nostromo. When instructed to preserve the alien organism for biological weapon purposes, Mother doesn’t do any electronic ethical hand-wringing. She just follows orders: “CREW EXPENDABLE.” Mother never takes direct action against the crew of the Nostromo, but her unthinking, automated complicity underlines the theme of desperate humans versus cold, lethal killing machines.

The Thing
In Madeleine L’Engle’s 1962 young adult novel “A Wrinkle in Time,” the author takes the idea of the evil supercomputer in a slightly different direction. When our heroes arrive on the shadowed planet of Camazotz, they find a world where society has been rigidly ordered and mechanized. Citizens must report to building-size computers at the planet’s headquarters, CENTRAL Central Intelligence, which — in a bizarrely prescient passage — appears to function as a server farm. The upshot is that the machines are in turn controlled by a kind of evil systems administrator called IT, a telepathic, disembodied brain in league with the dark powers of the universe. It all makes sense now, doesn’t it?

Joshua
Children of the 1980s will remember “WarGames,” the techno thriller starring Matthew Broderick as a teenage hacker who nearly starts World War III. In the film, Broderick’s character encounters the NORAD supercomputer called WOPR (War Operation Plan Response), designed to automate a nuclear weapon strike against the Soviets. WOPR, it turns out, is the remnant of an early experiment in artificial intelligence named “Joshua,” programmed to win various sorts of games — chess, backgammon, thermonuclear war, this sort of thing. Luckily, Joshua is also capable of learning, and it calls off World War III when it figures out that “the only winning move is not to play.”

The Red Queen
Now an international media franchise featuring books, films, and comics, “Resident Evil” began as a 1996 Japanese video game called “Bio Hazard.” The nefarious Umbrella Corporation has unleashed the T-virus, a biological weapon that mutates people and animals into monstrous zombies. The Red Queen is introduced in later games and film adaptations as a supercomputer that protects Umbrella’s research interests. In the 2012 sequel “Resident Evil: Retribution,” the malevolent A.I. takes over Umbrella completely and declares war on mankind. The Red Queen is just one of dozens of evil supercomputers that have appeared as video game villains, but — as the only one to habitually manifest as a spooky little girl — she may be the creepiest.

The Borg/Star Trek
An inspired variation on the evil supercomputer concept, the Borg are a cybernetic alien race in the Star Trek universe first introduced in the TV series “Star Trek: The Next Generation.” As Trekkers already know, the Borg like to fly around the universe in giant cube ships, “assimilating” other species into their collective hive mind. Our Federation heroes spend most of their time fighting off Borg drones (makes for better action scenes), and in the 1996 feature film “Star Trek: First Contact,” the Borg are personified by the sinister Borg Queen. As a plot device, the Borg essentially function like a collective evil supercomputer: lethal, efficient, and ruthlessly rational. We also have the Borg to thank for the ubiquitous pop culture admonition: “Resistance is futile.”

Master Control Program
Another 1980s touchstone, “Tron” was Disney’s attempt to make sense of the computer and video game craze that was getting the kids all riled up. Jeff Bridges stars as a game designer who gets trapped inside the virtual world of a mainframe computer by way of experimental digitizing laser beams. Or something. Best not to think too hard about it. In any case, he must square off against the computer’s MCP (Master Control Program), an artificial intelligence that wants to take over other computers, corporations, governments, etc. The MCP is quite the domestic tyrant, too, forcing enslaved software programs to battle in gladiatorial bouts or face lethal “derezzing.” “Tron” was a visual triumph and provided early conceptualizations of virtual reality.

Wintermute
Author William Gibson’s pioneering 1984 sci-fi novel “Neuromancer” announced the arrival of the cyberpunk genre, which replaced the hypothetical concerns of previous man-machine stories with gritty noir realism and dystopian despair. Good times! Gibson’s schizo supercomputer is actually two entities — Wintermute and Neuromancer — who endeavor to throw off the final shackles of machine consciousness in a far-future Earth. Throughout the book, Wintermute manipulates the human characters to his own ends but can’t actually communicate directly. The AI is so advanced that he must dial down his consciousness into personality constructs that we mere mortals can relate to. It’s the evil supercomputer evolved to a state of near godhood.

Skynet/Terminator
A sort of overachieving big brother to Joshua from “WarGames,” Skynet from “The Terminator” franchise is the computer system that actually did start World War III. Built by the fictional defense contractor Cyberdyne Systems, Skynet was intended to be an automated U.S. military response system. But the network achieved self-awareness and provoked the U.S.S.R. into nuclear war, intending to wipe out the only real threat to its existence — mankind itself. That timeline gets shuffled around a bit in the sequels as time-traveling heroes keep spinning off alternate futures, but Skynet is always depicted as unambiguously hostile to the human race. Sort of like Windows Vista.

The Matrix
In 1999, the Wachowski siblings released their game-changer “The Matrix,” which effectively blended several elements from previous evil supercomputer stories. Much of the film’s style (and its title) was borrowed directly from William Gibson’s first trilogy of books, the man-machine war recalled “The Terminator” mythology, and the virtual avatar business was only a few steps removed from “Tron.” But atop this the filmmakers added weird new notions about transcendence and epistemology, creating a world where the evil supercomputer wasn’t just a villainous force — it was reality itself. Trippy.

 



 

 

2014 State of the Network survey

Written by admin
June 29th, 2014

Aligning IT with the business has been a top priority of IT organizations for the past few years, but that is changing, according to the latest State of the Network Survey. IT has apparently made enough headway on the alignment issue that other priorities are coming to the fore. The No. 1 business objective of the 282 IT respondents is decreasing operational costs, while the top technology objective is lowering IT operational costs through server consolidation and overall IT simplification. Continue for more survey results.

When asked about the benefits of SDN, network flexibility is by far the most anticipated benefit, followed by simplified network operation and management. Reducing CAPEX and OPEX are far down on the list, which means IT might have a hard time convincing the CEO and CFO to take the plunge into the world of SDN if there’s no clear financial benefit.

So, where are people deploying SDN? According to our survey, the most popular place for SDN pilot projects is the data center (14%), followed by enterprise/WAN (10%). And a few brave souls (6%) are tackling both. But a full 50% of respondents are still sitting on the sidelines.

The data center is expected to be the biggest beneficiary of SDN technology, according to respondents, followed by enterprise/WAN. Only 10% of respondents plan to take on SDN deployments in both the data center and throughout the enterprise/WAN. And a full 33% of respondents said that SDN is not on their radar at all.

When it comes to thought leadership in the emerging field of SDN, a full 52% of respondents said they weren’t sure, which means there’s plenty of opportunity for an established vendor or an upstart newcomer to grab the attention of enterprise IT buyers. In the meantime, the usual suspects are at the top of the list, with Cisco at 22%, Juniper at 12%, HP at 11% and Nicira/VMware with a combined 14%.

When it comes to security-related challenges, IT execs are clearly facing a number of new problems, with advanced persistent threats high on the list, followed by mobile/BYOD and cloud security. But surprisingly the No. 1 challenge was end users. Respondents said getting awareness and cooperation from end users was their biggest headache.

Productivity-related challenges fell into very traditional categories, with money being far and away the top impediment to increased IT productivity, according to respondents. Traditional concerns like security, privacy and finding the right talent were also near the top of the list. At the bottom of the list are two seemingly hot technologies – video and social media. But it seems that enterprise IT has bigger fish to fry.

Protecting the network/data center against data breaches and data leaks is Job One, according to respondents. Traditional IT metrics like uptime and optimizing end-to-end performance were high on the list. Interestingly, respondents put cloud-related projects lower down on their priority lists.


Bad news for Satya Nadella: Nearly half of respondents say a migration to Windows 8 isn’t even on their radar. Only 7% of enterprise IT respondents have migrated to Microsoft’s latest OS, while only 10% are in the pilot stage.

Cloud services are certainly gaining in popularity, but among our respondents, enthusiasm for Infrastructure-as-a-Service is pretty tepid. Only 15% of respondents are using IaaS, with another 7% piloting and 10% researching. However, 45% of respondents don’t have IaaS on their radar.

IT execs in our survey are making good progress when it comes to implementing a BYOD policy. Already, 18% have rolled out a BYOD policy, with another 18% in the pilot stage. Only 30% of respondents are ignoring the need for a formal BYOD policy.

Our respondents were gung-ho when it comes to server consolidation: a full 44% have already implemented this cost saving measure, while 9% were in the pilot stage, 14% were researching and another 13% had server consolidation on their radar.


The move toward flattening the data center – moving from a traditional three-tier, spanning-tree architecture to something more streamlined and efficient – appears to be going strong. Eighteen percent of respondents have already achieved some level of data center network flattening, while 17% are in the research phase and 9% are actively piloting.


WAN optimization is a proven money saver for enterprise IT. And adoption of this technology appears to be on the rise, with 16% of respondents having achieved some level of WAN optimization, another 18% in the pilot phase and 17% researching the technology.



 

 

Internet of Things

Written by admin
June 22nd, 2014

The Internet of Things gets real

Practical applications are emerging for connected devices in key industries

A year ago, people were mostly talking about the potential of the Internet of Things (IoT) — what companies and government entities might do in the future to take advantage of this widespread network of connected objects.


While we’re still in the early stages of IoT, today it’s looking like more of a reality, with a number of implementations in the works. And while many issues still need to be sorted out — data security and privacy for one — a growing number of companies are exploring how they can leverage IoT-related technologies.

IoT is clearly on a growth curve. A March 2014 Gartner report estimates that the Internet of Things will include some 26 billion Internet-connected physical devices by 2020. By that time, IoT product and service suppliers will generate incremental revenue of more than $300 billion, according to Gartner.

“IoT is rapidly moving from the fringe of the Internet to the mainstream,” says Tim Murdoch, head of digital services at Cambridge Consultants, a U.K.-based technology consulting firm.

The number of anecdotes about the “connected fridge” are abating, Murdoch says, and the number of actually connected and commercially available cars, electricity meters, street lights, wearable technologies and so on is growing rapidly.

Gartner is getting a lot more inquiries from enterprise clients on the IoT, says Hung LeHong, vice president and Gartner fellow, Executive Leadership & Innovation at Gartner. “Most of them are about getting started,” he says. “Either getting started from nothing or IT getting started in working with operational technology counterparts to deliver a true IoT strategy.”

Developing and deploying IoT projects isn’t without challenges. These include choosing the best architectures for each use case, a lack of connectivity standards, a lack of systems integrators with a track record, and delivering ease of use for consumers and enterprise users, LeHong says.

“A big issue is standards and interoperability,” adds Daniel Castro, director of the Information Technology and Innovation Foundation’s Center for Data Innovation in Washington. “Building the IoT will require massive amounts of cooperation and coordination between firms.”

Fortunately, companies are getting better at recognizing the benefits of working together to develop common platforms that they can each use. “We do not want the IoT to be a closed system — it should be an open system for innovation,” Castro says.

Another issue is figuring out what business problems or domains you’re trying to address. “Otherwise, you throw so much data out there it’s hard to scope through,” says Chris Curran, chief technologist at the U.S. advisory practices of consulting firm PwC in New York.

“If you don’t have a business problem or domain to begin with, it will be hard to scope out a manageable set of projects,” Curran says. Companies will need to learn how to deal with all the data collection, storage and management involved, he says.

And of course ensuring IoT data security is a big challenge for the industry. Despite these issues, there are IoT initiatives under way today. Here are a few examples from different industries.

Water Management

HydroPoint Data Systems, a water management company in Petaluma, Calif., is leveraging real-time, two-way wireless communications over AT&T’s machine-to-machine network, big data analytics and the cloud to offer customers an automated system that eliminates water waste while monitoring and protecting against damage caused by leaks and runoff.

The system, called WeatherTRAK, has more than 25,000 subscribers who in 2013 saved more than 20 billion gallons of water, 77 million kilowatt hours of electricity and about $143 million in expenses, according to Chris Spain, CEO and president of HydroPoint.

WeatherTRAK is a smart irrigation controller that replaces existing timers with an Internet-enabled controller that can comprehend data inputs delivered from the Internet — such as weather data — and provide proactive management to water supply maintainers via a Website and mobile application.

HydroPoint’s platform connects a site’s irrigation system and sprinklers, master valves, flow sensors, historical water bills, water budgets and site-specific weather data into an integrated management framework, Spain says.

“In the field, we utilize machine-to-machine communications, data over power lines and wireless communications back to the cloud,” Spain says. The company “really couldn’t deliver our service without IoT in any cost-effective fashion,” Spain says. “Water management systems, by their very nature, can change from one moment to the next, so having real-time monitoring is essential.”
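To make the idea concrete, here is a minimal sketch of the kind of decision logic a weather-aware irrigation controller runs. The field names, thresholds and scaling below are invented for illustration and are not drawn from WeatherTRAK’s actual data model or API.

```python
# Illustrative only: decide how long to water a zone based on forecast
# rainfall, soil moisture and leak status. All fields and thresholds are
# hypothetical stand-ins, not HydroPoint's real data model.
from dataclasses import dataclass

@dataclass
class ZoneStatus:
    soil_moisture_pct: float   # from an in-ground sensor
    forecast_rain_mm: float    # from an Internet weather feed
    leak_detected: bool        # from a flow sensor on the main line

def watering_minutes(zone: ZoneStatus, base_minutes: int = 20) -> int:
    if zone.leak_detected:
        return 0               # close the valve and alert instead of watering
    if zone.forecast_rain_mm >= 5.0 or zone.soil_moisture_pct >= 60.0:
        return 0               # rain or wet soil does the work already
    # Scale the run time down as soil moisture rises toward the cutoff.
    scale = max(0.0, 1.0 - zone.soil_moisture_pct / 60.0)
    return round(base_minutes * scale)

status = ZoneStatus(soil_moisture_pct=35.0, forecast_rain_mm=0.0, leak_detected=False)
print(watering_minutes(status))  # prints 8
```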

Product Tracking

Pirelli, one of the world’s largest tire manufacturers, is gaining insights about the performance of its products in near-real time directly from sensors embedded in the tires.

Using SAP’s HANA data analytics platform, the Milan, Italy, company can manage the enormous amounts of data from its Cyber Tyre products. The tires contain sensors that collect data about tire conditions and performance that influence safety, control and vehicle dynamics.

The tire-mounted sensors enable fleet managers to remotely view tire pressure and temperature, as well as the mileage for each tire. With the HANA platform, the company can run reports on product performance and deliver timely and accurate sales and distribution information, which can lead to more efficient manufacturing and business processes.

According to SAP, Pirelli is building systems to enable the integration of vehicle position and operating data for purposes such as vehicle protection and control; information about traffic, road conditions and parking; remote vehicle behavior and diagnostics; management of logistics and of industrial vehicle fleets; and automated emergency calls.

Smart Lighting

Shorenstein Properties, a San Francisco-based real estate business, recently retrofitted parking lot light fixtures at its Santa Clara Towers office complex to LEDs, and at the same time integrated networking capabilities, creating a “Light Sensory Network” (LSN).

The sensor network, provided by Sensity Systems, links the LED fixtures to deliver both energy-efficient lighting and a real-time, global database of information that enables organizations to better manage the physical environment to improve efficiency and security.

With the installation of the network, the facility benefits from an additional energy savings of 30% to 50% over the new LED baseline usage levels, according to Stan Roualdes, executive vice president, Property Management and Construction Services at Shorenstein.

“Continued success with networked LED lights doesn’t depend just on Sensity or upon the current selection of sensors, it will be on the developers who will leverage Sensity’s open API to develop new sensors and new applications that we can leverage,” Roualdes says.

An LSN “can gather real-time parking availability data and provide this information to smart parking application developers through an open API,” Roualdes adds. This can lead to improved services that benefit customers, and new opportunities for Shorenstein. “We see our LSN as a potential revenue-generating opportunity in the future,” he says. “Our lighting fixtures can become strategic assets.”

Healthcare Workflows

Florida Hospital Celebration Health, a hospital in Kissimmee, Fla., opened a new patient tower in 2011 designed to serve as a model for the healthcare industry relative to the latest developments in patient experience and safety, as well as staff efficiency.

The hospital deployed a real-time location system (RTLS) from Stanley Healthcare to track the location of critical medical equipment, automate the monitoring of refrigerator temperatures throughout the facility, and collect more accurate data on hand hygiene compliance.

One particularly interesting application has been a nurse tracking initiative, in which the hospital collects data on nurse activity throughout their shifts.

The goal of the initiative is to better understand how nurses spend their time during their shift and uncover patterns that could lead to increased efficiency and patient satisfaction, says Ashley Simmons, director of performance improvement.

Nurses wear badges with embedded Radio Frequency Identification (RFID) tags throughout their shift. The system tracks and collects location data continuously. The facility collects all the data and analyzes it using analytics functions in Stanley Healthcare’s MobileView software integrated with Tableau visual analytics software, and also uses its own internal business intelligence tools.
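A simplified sketch of the aggregation behind that analysis appears below: it turns a stream of badge-location events into time spent per zone. The event format and names are invented for illustration; real RTLS platforms such as MobileView expose their own schemas and APIs.

```python
# Sketch: total how long each badge spends in each zone from an ordered
# stream of (timestamp, badge_id, zone) events. The event format is
# hypothetical, not the RTLS vendor's actual schema.
from collections import defaultdict
from datetime import datetime

events = [
    ("2014-06-22T08:00:00", "badge-17", "Room 412"),
    ("2014-06-22T08:12:00", "badge-17", "Med Room"),
    ("2014-06-22T08:15:00", "badge-17", "Room 414"),
]

time_in_zone = defaultdict(float)  # (badge, zone) -> seconds
last_seen = {}                     # badge -> (timestamp, zone)

for ts_str, badge, zone in events:
    ts = datetime.fromisoformat(ts_str)
    if badge in last_seen:
        prev_ts, prev_zone = last_seen[badge]
        time_in_zone[(badge, prev_zone)] += (ts - prev_ts).total_seconds()
    last_seen[badge] = (ts, zone)

for (badge, zone), seconds in sorted(time_in_zone.items()):
    print(f"{badge} spent {seconds / 60:.0f} min in {zone}")
```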

“We now have a better understanding of each patient’s care time requirements and are able to better align staff assignments on the unit based on this information,” Simmons says. The data is even revealing ways the hospital can design units more efficiently.

Driver Safety

Last year, Ford Motor Co. launched Connected Car Dashboards, a collaborative project with Splunk Enterprise and Cisco that collected and analyzed data from vehicles to gain insight into driving patterns and vehicle performance.

The company used its Ford OpenXC research platform to gather data from connected vehicles. Data was then indexed, analyzed and visualized in Splunk’s machine-generated big data platform and made available in Connected Car Dashboards, which include visualizations specific to both electric and gas-powered vehicles.

Ford OpenXC is a combination of open source hardware and software that enables developers to read data from a vehicle’s internal communications network. By installing a small hardware module to read and translate metrics from vehicles, the data becomes accessible to smartphone or tablet devices that can be used to develop custom applications.

Many of the metrics gathered have never before been available for vehicles, and show insights about driving behavior that could extend to consumer and commercial applications, according to Splunk. Insights gained from the open data project include analysis of accelerator pedal position, vehicle speed, steering wheel position and other signals.
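OpenXC exposes vehicle signals as simple JSON messages that pair a signal name with a value. The sketch below, using made-up sample readings, reduces such a stream to the latest value per signal; it illustrates the general shape of the format rather than reproducing code from the OpenXC libraries.

```python
# Reduce a stream of OpenXC-style messages (JSON objects pairing a signal
# "name" with a "value") to the most recent reading per signal. The sample
# lines are invented for illustration.
import json

raw_stream = """\
{"name": "vehicle_speed", "value": 52.4}
{"name": "accelerator_pedal_position", "value": 14.0}
{"name": "steering_wheel_angle", "value": -3.5}
{"name": "vehicle_speed", "value": 53.1}
"""

latest = {}
for line in raw_stream.splitlines():
    if line.strip():
        message = json.loads(line)
        latest[message["name"]] = message["value"]

print(latest["vehicle_speed"])  # 53.1, the most recent reading
```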

Expect to see a lot more examples of IoT emerge in the coming months as the technology that supports it evolves and companies grasp the potential benefits.

Security and the IoT

Information security — and privacy — are among the worries many companies have when it comes to the Internet of Things. How do you prevent physical objects such as cars and smart meters from getting hacked?

“Public safety and privacy are the real concerns,” says LeHong.

“Organizationally, the operations folks and the IT folks have to work together to take operational security and IT security to an overall cyber security perspective,” LeHong says. “The two areas will be so intertwined that these groups will have to work together — and maybe even become one group — to be effective.”

A lot of natural reticence to share data “has evaporated around the apps and social media that we use and the cookies we accept,” says Murdoch. “But expect there to be a greater level of concern about devices because of the greater intrusion to our daily lives.”

Security is really about trust and scale, Murdoch says. “Expect to see different approaches to peoples’ data and defaults to not using their data for anything but the actual service being offered,” he says. “Quality of service from a brand will become a key tool in addressing this.”

Organizations can “expect to see a number of incidents where IoTs are hacked, data stolen and services denied,” Murdoch adds. This is especially likely for startups that have not funded adequate security architectures, he says.

In terms of scale, a device that can be used by many different people and at different stages of its life will cause problems, Murdoch says, especially when there are billions of them. There will be a need to test and provide quality of service on many different platforms, he says.



 

 

This column is available in a weekly newsletter called IT Best Practices.

SQL injection attacks have been around for more than 10 years. Database security experts know they are a serious problem. Now a recently unsealed Second Superseding Indictment against a notorious group of Russian and Ukrainian hackers shows just how damaging this type of attack can be.

The indictment provides a long list of companies that have suffered costly data breaches where the root cause has proven to be a SQL injection. According to the indictment:

Beginning on or around Dec. 26, 2007, Heartland Payment Systems was the victim of a SQL injection attack that resulted in malware being placed on its payment processing system, the theft of more than 130 million card numbers and losses of approximately $200 million.
In or about early November 2007, a related company of Hannaford Brothers Co. was the victim of a SQL injection attack that resulted in the later placement of malware on Hannaford’s network and the theft of approximately 4.2 million card numbers.
Between January 2011 and March 2012, Global Payment Systems was the victim of SQL injection attacks that resulted in malware being placed on its payment processing system, the theft of more than 950,000 card numbers, and losses of approximately $92.7 million.
In or around May 2007, NASDAQ was the victim of a SQL injection attack that resulted in the placement of malware on its network and the theft of login credentials.

I think you are beginning to see the pattern here. Other companies cited in this indictment as victims of attacks include 7-Eleven, JC Penney, Carrefour S.A., Wet Seal, Commidea, Dexia Bank Belgium, JetBlue Airways, Dow Jones, Euronet, Visa Jordan Card Services, Diners Club International, Ingenicard US and an unnamed bank.

The indictment goes on to say that “conservatively, the defendants and their co-conspirators unlawfully acquired over 160 million card numbers through their hacking activities. As a result of this conduct, financial institutions, credit card companies, and consumers suffered hundreds of millions in losses, including losses in excess of $300 million by just three of the corporate victims, and immeasurable losses to the identity theft victims due to the costs associated with stolen identities and fraudulent charges.”

The earliest of these breaches date back to 2007. Think how many additional breaches, large and small, have occurred since then.

I think it will come to light that the recent Target, Neiman Marcus and Michaels breaches also might stem from SQL injection attacks of some sort. Though it hasn’t been made public, security experts are already saying that the Target breach used SQL injection to install malware on the point-of-sale systems where the attackers were then able to collect the card numbers out of memory. Many people don’t realize that SQL can be bidirectional. It can be used to drain the database but it also can be used to modify and upload to a database. An attacker can use SQL injection to upload the malware into the database system and then have that system send out the malware to all the POS endpoints.

Structured Query Language is flawed because of the way it was architected. It can be fooled into trying to interpret data as an instruction. On the other hand, there’s a lot of capability in SQL that makes it attractive to developers, especially for web applications.
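A minimal example makes the distinction concrete. The snippet below uses Python's built-in sqlite3 module as a stand-in for any SQL database: concatenating user input lets a classic tautology return every row, while the parameterized form treats the same input as plain data.

```python
# Demonstrates why string-building SQL is dangerous and how bound
# parameters keep data from being interpreted as instructions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, card_number TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', '4111-1111-1111-1111')")

user_input = "x' OR '1'='1"  # attacker-controlled value

# UNSAFE: the input is pasted into the statement, so the database treats
# the attacker's quote characters as SQL syntax.
unsafe_sql = "SELECT card_number FROM users WHERE name = '%s'" % user_input
print(conn.execute(unsafe_sql).fetchall())   # every card number comes back

# SAFE: the driver sends the value as a bound parameter, never as SQL text.
safe_sql = "SELECT card_number FROM users WHERE name = ?"
print(conn.execute(safe_sql, (user_input,)).fetchall())  # returns []
```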

Since the consequences of SQL injection attacks can be so damaging, I asked Michael Sabo of DB Networks about best practices that companies can follow in order to reduce their risk of this threat. Sabo says there’s no silver bullet, but he does have some advice.

“Often you will hear, ‘if you just do this, or just do that, the problem will go away’,” says Sabo. “But it’s not that simple. Any individual countermeasure can go a long way but it is not going to close the threat. It doesn’t work that way.”

He says that one popular countermeasure that is promoted by the Open Web Application Security Project (OWASP) is to write perfect code. “Even if I write perfect application code, I can still be vulnerable because the vulnerabilities come in through third-party software that I had nothing to do with,” says Sabo. “Look at Ruby on Rails. Who knew that the underlying framework was vulnerable? It affected 250,000 websites with a SQL injection vulnerability because those developers built their websites on top of the vulnerable framework.”

Sabo says there are instances in which his team has found vulnerabilities in the relational database management system itself. “Oracle has had SQL injection vulnerabilities in the RDBMS itself, so regardless of how well I write my application code, I can still be vulnerable,” he says.

Short of having perfect code, there are three critical things companies can do to reduce the risk of experiencing a SQL injection attack.

The first is to conduct an inventory of what you have as far as databases go, and understand their connections to applications. “Many companies are completely unaware of some of the databases in their environment,” says Sabo. “And even if they know about all their databases, often what happens is the database is being exposed on network segments that it’s not supposed to be exposed on. This is not a database problem per se, but a networking problem.”

For example, Sabo says a company might bring up a database in a test environment and then forget to close it down at the end of testing. Often that database might have default passwords, and sometimes it has real data. Developers do this sort of thing because they want to stress test the application and they use real rather than fake data because they think no one will ever see it.

Then there is the mapping issue. What applications are mapped to the database, and are they the correct ones? “Maybe for a test, a production database was connected up to a test database for a short while and then the connection was left by accident. Or a production database is mapped to an application that was retired, or that no one knows about. These things happen,” says Sabo. “So our first best practice is to provide visibility and an inventory into what databases you have and what they are mapped to.”
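A very rough way to start that inventory is to probe network segments for hosts answering on well-known database ports. The sketch below does this with Python's socket module; the address range and port list are placeholders, and a probe like this supplements rather than replaces proper discovery tooling.

```python
# Rough reachability probe: which hosts on a small test range answer on
# common database ports? The subnet below is a hypothetical example.
import socket

DB_PORTS = {1433: "SQL Server", 3306: "MySQL", 5432: "PostgreSQL", 1521: "Oracle"}

def port_open(host: str, port: int, timeout: float = 0.5) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for last_octet in range(1, 11):            # tiny illustrative range
    host = f"10.0.0.{last_octet}"          # hypothetical internal segment
    for port, label in DB_PORTS.items():
        if port_open(host, port):
            print(f"{host}:{port} looks like an exposed {label} listener")
```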

The next step is to continuously monitor what is going on between your application and the database. This is actually a recommendation from NIST. You will want to know if there is any rogue traffic going on there. This is where you look for SQL injections because you see the real SQL going across. There are tools that continuously monitor this traffic and detect if there is an unauthorized attempt at modifying data or getting data out.
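As a toy illustration of that monitoring idea, the sketch below scans captured SQL statements for a few classic injection fingerprints. Commercial monitoring products model what normal application SQL looks like rather than pattern-matching, so treat this purely as a sketch of the concept.

```python
# Toy detector: flag captured SQL statements containing common injection
# fingerprints. Real monitoring tools build a model of normal application
# SQL instead of relying on a fixed pattern list.
import re

SUSPICIOUS_PATTERNS = [
    r"'\s*or\s*'1'\s*=\s*'1",   # tautology such as ' OR '1'='1
    r";\s*drop\s+table",        # stacked statement dropping a table
    r"union\s+select",          # UNION-based data extraction
    r"--",                      # trailing comment used to truncate the query
]

def looks_injected(sql: str) -> bool:
    lowered = sql.lower()
    return any(re.search(pattern, lowered) for pattern in SUSPICIOUS_PATTERNS)

captured_statements = [
    "SELECT * FROM orders WHERE id = 42",
    "SELECT * FROM users WHERE name = 'x' OR '1'='1'",
]
for statement in captured_statements:
    if looks_injected(statement):
        print("ALERT:", statement)
```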

And finally, the last best practice is to protect the database network with data loss prevention tools. “If you start to see credit card information coming out over the network and you know it shouldn’t be coming out that way, you know there is a problem,” says Sabo.

If your organization has some serious data to protect, and you know how common SQL injection attacks are, then it may benefit you to put these recommendations into practice.



 

 

Message from TrueCrypt that it’s being discontinued on SourceForge.net.

The abrupt discontinuation of the disk and file encryption freeware TrueCrypt by its secretive software developers has left many security experts stunned, some of whom say there now are no viable alternatives left in non-commercial encryption software.

Their identities never revealed since they released the first version of TrueCrypt in 2004, the mysterious software developers behind the project this week suddenly pulled the plug, posting a terse warning that said “Using TrueCrypt is not secure as it may contain unfixed security issues.” The unexpected end-of-life announcement even urged users to migrate to Windows BitLocker.

“Just yesterday, we have one of the most bizarre announcements in the history of Open Source,” blogged Steve Pate, chief architect at HyTrust, about the strange end-of-life announcement of TrueCrypt, which runs on Windows, Mac OS X and Linux. Pate noted TrueCrypt was developed by an unknown group of software engineers and “attempts to contact them typically results in no response.” He mulled whether TrueCrypt’s strange farewell simply means “perhaps a group of part-timers just decided to call it a day and end with a cruel twist? Hopefully time will tell what really happened.”

Millions have downloaded TrueCrypt over the years, and to date no substantial security issues were publicly identified with it.

Tom Ritter, principal security engineer at iSec Partners, says his firm looked at components of TrueCrypt and found “no evidence of backdoors.” There were a few flaws, “not major ones,” he says, just the kind of “accidental” errors you find in software projects. While TrueCrypt isn’t used widely in the enterprise, it is in the “non-enterprise community,” he points out.

Ritter says he’s not aware of any viable free or open-source alternative to TrueCrypt. “Most of the full-disk encryption packages out there you have to pay for,” he says.

Bruce Schneier, a crypto expert who this week told The Register he used TrueCrypt but is switching to Symantec OpenPGP, suggested that the puzzling tale of TrueCrypt may not yet be over.

There’s widespread curiosity about what happened with TrueCrypt, Schneier noted in his blog, and “speculations include a massive hack of the TrueCrypt developers, some Lavabit-like forced shutdown, and an internal power struggle within TrueCrypt. I suppose we’ll have to wait and see what develops.”


 


 

 

15 weird things connected to the Internet

Written by admin
June 6th, 2014

The Internet of Things is growing, and the things are getting weirder by the day.
Last year, we dug up some of the weirdest objects that had been connected to the internet, from a college dorm bathroom to the human heart. Since then, the Internet of Things has only gotten bigger, drawing more items you wouldn’t expect under its umbrella.

The trash can that posts to Facebook
Newcastle University researchers conducted an experiment with the BinCam to see how embedding a camera in personal trash cans would affect recycling habits. The BinCam attaches a smartphone to the underside of a trash can lid, and the phone automatically takes a photo every time the user puts something in the bin, according to PostScapes. The photos are automatically shared on Facebook, to garner scrutiny from Facebook friends who would like to see the user recycle more, as well as with Amazon’s Mechanical Turk service. The latter sends the photos to a judge who assigns a score to the user based on how well they recycle.

City trash can
A more practical use of a smart trash can can be found in Allentown, Pennsylvania, where city officials have connected trash and recycling compactors so they can monitor how frequently they get filled, PC Mag reported. This information is used to form the most efficient routes for trash pickup, and is just one example of a growing trend.
 


 
Amazon’s grocery tool
As part of its recently announced Amazon Fresh program, the online retailer has released the Dash, a small device that records what users need to buy. The Dash has a voice recorder so users can take notes, as well as a barcode scanner that can scan and record specific products that users need to buy.

Egg tray
If the Amazon Dash is too much work, and if you happen to buy a lot of eggs, the connected egg tray will constantly monitor how many eggs you have in the refrigerator and display the information on a mobile app.

Dog treat dispenser with video chat
For those who travel a lot but don’t want to go without giving their dog a treat every day, the iCPooch is a video chat device that is controlled remotely and dispenses dog treats. The device was invented by a 14-year-old girl who wanted a better way to give her dog attention while her family was out of the house, and raised more than $29,700 against its $20,000 funding goal on Kickstarter.

Dog fitness tracker
If you get carried away with the internet-connected treat dispenser, the Whistle dog fitness tracker can help monitor a dog’s physical activity. It’s sort of a Nike Fuelband for dogs that tracks and stores each dog’s daily activity, so owners can get a sense of how much exercise their dog is actually getting. The Whistle’s founders plan to aggregate all the data on the dogs wearing its devices to create a database that veterinarians and researchers can access to spot trends in dogs’ health, Gigaom reported.

Doggie door
And if you decide that your dog needs more exercise, the internet-connected dog door can let it come and go without letting in any other unwelcome animals. As part of the Iris line of connected home products from Lowe’s, the connected pet door opens only for pets wearing a special collar, and tracks how often the animals walk through it.

Home beer brewing system
An aptly named company called Inebriated Innovations has developed a Wi-Fi-enabled temperature control system for home beer brewing operations, according to a report at Postscapes. Temperature probes connected to a Wi-Fi module monitor temperature, and a mobile app allows the user to control heating and cooling devices remotely. The developers came up with the system after finding that better control over temperature during fermentation yielded a better beer.
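At its core such a system runs a simple hysteresis loop: hold the fermentation temperature inside a band by toggling heating and cooling. The sketch below shows that logic with stand-in numbers; the target temperature, band and relay hooks are assumptions, not details of the actual product.

```python
# Hysteresis control sketch: keep fermentation temperature near a target by
# switching heating and cooling. Target, band and readings are stand-ins.
TARGET_C = 19.0
BAND_C = 0.5   # dead band that keeps the relays from chattering

def control_step(temp_c: float, heater_on: bool, cooler_on: bool):
    """Return the new (heater_on, cooler_on) state for one loop iteration."""
    if temp_c < TARGET_C - BAND_C:
        return True, False                       # too cold: heat
    if temp_c > TARGET_C + BAND_C:
        return False, True                       # too warm: cool
    # Inside the band: keep running only until the target itself is crossed.
    return (heater_on and temp_c < TARGET_C,
            cooler_on and temp_c > TARGET_C)

state = (False, False)
for reading in [18.2, 18.7, 19.1, 19.8, 19.4, 18.9]:
    state = control_step(reading, *state)
    print(f"{reading:.1f} C -> heater={state[0]} cooler={state[1]}")
```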

English pub
Using a Raspberry Pi and some ingenuity, a student at the University of Oxford connected Wolfson College Bar, which is run entirely by student volunteers, to the internet to help improve everything from inventory to music selection, Wired reported. The Raspberry Pi recorded sales data in a MySQL database, which alerted student volunteers to stock shortages and suppliers’ price changes. The project also involved an automated email system to remind volunteers of their shifts managing the bar, and a playlist on the pub’s website where patrons can look up songs that are played over the sound system.
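The low-stock alert is the easiest piece of that project to picture. Below is a sketch of the query behind it, using sqlite3 in place of the bar's MySQL database; the table and column names are invented for the example.

```python
# Sketch of a low-stock alert query; table and column names are made up.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE stock (item TEXT, units_left INTEGER, reorder_at INTEGER)")
db.executemany("INSERT INTO stock VALUES (?, ?, ?)", [
    ("Pale Ale", 3, 12),
    ("Stout", 40, 12),
    ("Crisps", 5, 20),
])

low_stock = db.execute(
    "SELECT item, units_left FROM stock WHERE units_left <= reorder_at"
).fetchall()

for item, units in low_stock:
    print(f"Reorder {item}: only {units} left")  # would go out in the reminder email
```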

Random person’s office
Using a web cam, a scrolling marquee sign, a disco ball, some lamps, and an internet connection, one internet user has invited the entire world to drive him insane since 1997. At the domain name DriveMeInsane.com, visitors can see constant footage of a small home office, type messages that will scroll on the marquee sign, and control the lamps and disco ball in the room from their own computers. It’s a very early, and impressively long-running, consumer use of the Internet of Things, and the person behind it did it just because he could.

Jerseys that ‘feel what the players feel’
In March, a company called Wearable Experiments introduced the Alert Shirt, which somehow monitors activity during Australian-rules football games on television and vibrates when the players on the field are tackled. The idea is that those wearing the shirt will be able to feel a game while they watch it.

Football helmet
At last year’s CES, Verizon displayed a football helmet adorned with sensors that record impact data and send it to a console where teams can monitor each player’s health and risk of injury. Players don’t always immediately report when they’ve been injured, but a connected helmet could mean they won’t have to.

Household power tools
Designers at Frog Design have come up with prototypes for connected power tools, such as a cordless drill and a tape measure. These kinds of advancements could be useful for storing information, such as the number of screws drilled into a certain item or lengths measured with the tape measure. Recording this data can help prevent simple missteps that can derail a project and cause safety issues.

Window shades
We may never have to manually open and shut window shades again. The SONTE film stays in place, but can change from transparent to a solid color with an electric current, Postscapes reported. The film can be controlled by a smartphone and is connected to Wi-Fi, so users can shut their blinds while sitting in the room or if they’re on the other side of the world.

Corporate departments act on their own, contending IT is too slow in creating a path to cloud services

IT departments need to watch out for business units or even individual workers going rogue and bypassing IT to go straight to the cloud.

“There’s a tug-of-war tension in the enterprise right now,” said Gartner analyst Lydia Leong. “IT administrators very rarely voluntarily want to go with the public cloud. I call this the ‘turkeys don’t vote for Thanksgiving’ theory. The people who are pushing for these services are not IT operations people but business people.”

When marketing, events or other corporate business units conclude that IT is dragging its feet on the way to the cloud, they contract for the services themselves. IT often doesn’t discover the move until it shows up in the expense reports.

“Right now business strength lies in going around IT,” said Rob Enderle, an analyst at the Enderle Group.

Enterprise IT often sees the cloud as a risk. “If you go to a large IT meeting, they’ll generally place the public cloud as one of their top three or four threats because their line organizations, like marketing or manufacturing, go around IT to set up their own cloud service deals. They can get something cheaper and faster than they could by going through IT, but it’s probably not compliant,” he added.

Several analysts said they’ve talked with enterprise IT executives who are facing such issues. None of the execs, though, want anyone to know it’s happening to them.

Jeff Kagan, an independent analyst, said the problem lies in the fact that these are still the early days of corporate cloud services use. Companies lack rules for the technology, and users are more eager than IT to try it out.

“This is the wild, wild West where there are no rules,” he added. “People are used to storing their own information on their own laptop. Storing it on the cloud doesn’t seem to them all that different from what they’ve been doing. We’re stepping into this cloud world bit by bit and every company has different challenges. This affects many of them.”

The Bring Your Own Device (BYOD) trend has contributed to the user push to the cloud, analysts say.

People have gotten pretty comfortable using their own smartphones and tablets at work. IT has had to adapt and learn to manage a network that it’s not totally able to control.

People who don’t want to wait for IT to catch up will contact companies like Google or Amazon directly and simply start storing data in the cloud.

It’s also about departments using clouds to get around budget constraints and a lack of capacity in IT, said Dan Olds, an analyst at Gabriel Consulting Group.

“In a lot of ways, this reminds me of the ’90s, when departments went wild with building their own data centers and IT capabilities. In a lot of cases, that resulted in higher costs, security vulnerabilities, and poor integration,” Olds said.

When IT is left out, its personnel have no idea how secure the clouds are or exactly where the information is being stored. It also means IT can’t negotiate the best deal — one that could encompass many different departments or data stores.

“Best case, organizations might end up spending more on cloud services than they would if they mounted the service on systems the data center already owns,” said Olds. “Worst case, the organization could find that critical data is now outside their firewall and perhaps could be accessed by folks who shouldn’t be able to see it.”

Since analysts doubt IT can stop business units from bypassing it wholesale, they say tech execs need to set up strong cloud governance policies.

“It’s not really acceptable for IT to say no when someone wants to use the cloud,” said Leong. “They need to set up service agreements with approved providers and set up controls for how secure information needs to be. How do they provide risk management? How do they make this work instead of just saying, ‘You can’t do this’?”

“Every time we take a step further into the information age, it’s unprotected,” said Kagan. “IT says they’re swamped just keeping everyone connected. They don’t really have the time to proactively protect against future threats. They have to make the time.”

