
VersaTrust Blog

VersaTrust has been serving the Texas area since 1997, providing IT support such as technical helpdesk support, computer support, and consulting to small and medium-sized businesses.

Dispelling the myths about containers

Business owners barely had time to acquaint themselves with virtualization before the next trend stormed onto the scene. Although container and virtualization applications both allow users to divvy up software and hardware more efficiently, containers have many advantages over virtualized machines. There are a number of misunderstandings, though, and it's time to set the record straight.

Containers are made up of the bare minimum hardware and software requirements needed to allow a specific program to run. For example, if you want to give employees access to a single Linux-based server application, but everything else you run is in Windows, it would be a waste to build a new machine for just that program. Containers allow you to partition just the right amount of hardware power and software overhead to run that Linux program on your Windows server.

Misconception #1: There is only one container vendor

Traditional virtualization technology -- which creates entire virtual computers rather than single-application containers -- has had two decades for vendors to enter the market and improve their offerings. Containers, however, didn’t break into the mainstream until a few years ago.

Fortunately, there are more than enough container vendors. Docker dominates the industry and the headlines, but there are at least a dozen other programs to choose from.
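
To make the idea concrete, here is a minimal sketch of how a container is defined with Docker, the vendor named above. The file contents and image names are illustrative assumptions, not anything taken from this article:

  # Dockerfile: the whole "machine" is just a runtime plus one application
  FROM python:3.12-slim
  WORKDIR /app
  COPY app.py .
  CMD ["python", "app.py"]

Building and running it takes two commands (docker build -t myapp . and then docker run --rm myapp), and the result carries none of the overhead of a full virtual machine.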

Misconception #2: Containers require virtualization

In the early days, containers could only be created and managed in the Linux operating system. This meant complicated and sometimes unreliable improvisation was required to benefit from container technology on Windows and Mac servers.

First, you would need to virtualize a full-fledged Linux install on your Windows or Mac server, and then install container management inside of Linux. Nowadays, container management software can run on Windows and MacOS without the confusing multi-layer systems.

Misconception #3: You can’t create and manage containers in bulk

Separate programs, known as orchestrators, allow you to scale up your use of containers. If you need to partition more hardware power so that more users can use a container, or if you need to create several identical containers, orchestrators make that possible.
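
Kubernetes is one widely used orchestrator (the article doesn't name one, so treat this as an illustrative assumption). Creating several identical containers and resizing that group later comes down to one-line commands against a hypothetical deployment called web:

  # Start with three identical copies of the container
  kubectl create deployment web --image=nginx:1.25
  kubectl scale deployment web --replicas=3

  # Demand grew? Scale the same deployment up without touching the definition
  kubectl scale deployment web --replicas=10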

Misconception #4: Containers are faster than virtual machines

Obviously, virtualizing an entire operating system and the hardware necessary to run it demands more management overhead and processing power. A lot of people take this to mean containers are faster than virtualized machines. In reality, containers are just more efficient.

Accessing a container is as simple as opening it and using the application. A virtualized machine, however, needs to be booted up, a user needs to log in to the operating system, and then you can rummage through folders to open an application. Most of the time containers are faster, but there are instances when that's not true.

Virtualization and containers are complicated technologies. For now, just remember that 1) Virtualization and containers are separate technologies, each with pros and cons; and 2) you have plenty of software options to manage containers (sometimes in bulk). For anything more specific than that, give us a call!

Published with permission from TechAdvisory.org. Source.


NSA-approved: mobile virtualization

Server and desktop virtualization have been improving computing efficiency and data security for years. But with all the talk about mobile BYOD policies and corporate data protection on smartphones, the National Security Agency (NSA) believes virtualization is the key to true security. Here’s what you need to know:

US government approved

The NSA maintains a program named Commercial Solutions for Classified (CSfC) that tests and approves hardware to assist government entities in optimizing security. For example, if a public-sector network administrator is deciding which mobile devices to purchase for office staff, CSfC lists which devices are approved for various government roles.

Offices in the intelligence community usually require virtualization hardware and software as a minimum for laptops and tablets. But until now, no smartphone that included the technology had passed the tests. A recently released model of the HTC A9, however, includes mobile virtualization functionality that got the green light.

What is mobile virtualization?

Virtualization is an immensely complicated field of technology, but when it comes to mobile devices the process is a little simpler. Like any mobile device management plan, the goal of mobile virtualization is to separate personal data from business data entirely. Current solutions are forced to organize and secure data that is stored in a single drive.

Essentially, current phones have one operating system containing a number of folders that can be locked down for business or personal access. But the underlying software running the whole phone still connects everything. So if an employee downloaded malware hidden in a mobile game, it could spread through the entire system, regardless of how secure individual folders are.

With mobile virtualization however, administrators can separate the operating system from the hardware. This would allow you to partition a phone’s storage into two drives for two operating system installations. Within the business partition, you could forbid users from downloading any apps other than those approved by your business. If employees install something malicious on their personal partition, it has no way of affecting your business data because the two virtualized operating systems have no way of interacting with each other.

Although it’s still in its infancy, the prospect of technology that can essentially combine the software from two devices onto a single smartphone’s hardware is very exciting for the security community. To start preparing your organization for the switch to mobile virtualization, call us today.

Published with permission from TechAdvisory.org. Source.


Windows Server 2016 and virtualization

Virtualization is a great way to save money and increase the efficiency of your existing IT hardware, but how exactly do you implement a virtualization solution? There are several vendors that provide software solutions, but there’s one almost everyone has already worked with: Microsoft. In its latest operating system release there are a few ways to virtualize your office.

A brief history of Windows Server

The Windows Server operating system has been around for decades. As an advanced option for onsite servers, this operating system grants access to high-level access management settings, DNS customizations, and network configuration management. In fact, it’s such a complicated solution that Microsoft offers certification courses for each version of the operating system.

The most recent iteration of this operating system is Windows Server 2016 (WS16). Released on October 12, 2016, Microsoft's latest server software includes countless improvements to its networking and user management features. Where it really shines, however, is in how it handles virtualized computing.

Virtualization in Windows Server 2016

As with just about anything in the virtualization world, containers dominate the WS16 conversation. Containers use software to aggregate the bare minimum requirements that one application needs to run -- hardware, software, and operating system -- and deliver that package across a network to computers that lack one or more of those requirements. For example, if you want to run an application that requires a huge amount of processing power on a bare-bones workstation, you can create a container with the necessary components on your server and let the workstation access it remotely.

WS16 users have access to two types of container deployments: Hyper-V containers and Windows Server containers. To the average business owner, the differences between these two options are minute, but what is important is Microsoft's commitment to compatibility. If virtualization is important to you, choosing WS16 is a great way to ensure that you'll be ready for whatever develops among the disparate providers.
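
In practice, the choice between the two comes down to a single flag when a container is launched with Docker on WS16. A sketch, assuming Docker is installed; the image name is a placeholder that depends on your version:

  # Windows Server container: shares the host kernel, lighter weight
  docker run --isolation=process microsoft/windowsservercore cmd

  # Hyper-V container: same image, wrapped in its own minimal VM for isolation
  docker run --isolation=hyperv microsoft/windowsservercore cmd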

Another great virtualization feature in WS16 is software-defined storage (SDS). It’s a complicated solution, but it essentially allows you to create hard drive partitions outside of the confines of hardware limitations. You can create a single drive by pooling storage space from three different servers, or you can create several separate drives for virtualized workstations to access.
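
Microsoft's implementation of this idea in WS16 is Storage Spaces Direct, and pooling drives from several servers comes down to a couple of PowerShell cmdlets. A hedged sketch; the volume name and size below are assumptions for illustration:

  # Pool the local drives of every server in the cluster into one software-defined pool
  Enable-ClusterStorageSpacesDirect

  # Carve a single resilient volume out of that pool for virtualized workstations to share
  New-Volume -StoragePoolFriendlyName "S2D*" -FriendlyName "VMStore" -FileSystem CSVFS_ReFS -Size 2TB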

Obviously, managing a server is no easy task -- regardless of whether or not you implement a virtualized infrastructure. That complexity comes with some compatibility issues; if your business relies on old software, it may not have been updated to run with WS16. For everything from creating a transition plan to managing your virtualized framework, give us a call today.

Published with permission from TechAdvisory.org. Source.


What is virtual “sandboxing”?

Virtualization comes with several benefits for small- and medium-sized businesses. One of the most important is cybersecurity, and even within that subset there are several strategies for protecting your organization. One such strategy is referred to as sandboxing, and it's worth learning about.

What is sandboxing?

Sandboxing is one of the rare concepts in virtualization that the average person can usually grasp in just a couple short sentences. Essentially, sandboxing is the practice of tricking an application or program into thinking it is running on a regular computer, and observing how it performs. This is especially useful for testing whether unknown applications are hiding malware.

Obviously, it gets far more complicated once you delve into the details of how a sandboxing technique is implemented, but the short answer is that it almost always involves virtualized computers. The program you want to test thinks it has been opened on a full-fledged workstation or server and can act normally, but it's actually inside a tightly controlled virtual space that forbids it from copying itself or deleting files outside of what is included in the sandbox.

An effective way to quarantine

Virtualization is no simple task, but the benefits of sandboxing definitely make the effort worth it. For example, virtualized workstations can essentially be created and destroyed with the flip of a switch. That means:
  1. You aren’t required to manage permanent resources to utilize a sandbox. Turn it on when you need it, and when you’re done the resources necessary to run it are reset and returned to your server’s available capacity.
  2. When malware is exposed inside a sandbox, removing it is as simple as destroying the virtual machine. Compare that to running a physical workstation dedicated solely to sandboxing. Formatting and reinstalling the machine would take several hours.
  3. Variables such as which operating system the sandbox runs, which permissions quarantined applications are granted, and how long tests must run can all be adjusted in a matter of minutes.
This strategy has been around for nearly two decades, and some cybersecurity experts have spent their entire careers working toward the perfect virtual sandbox.

Containers: the next step in this evolution

Recently, the virtualization industry has been almost totally consumed by the topic of “containers.” Instead of requiring an entire virtual workstation to run a suspicious application in, containers are virtual spaces with exactly enough hardware and software resources to run whatever the container was designed to do.

Think of the metaphor literally: Older sandboxes came in a uniform size, which was almost always significantly larger than whatever you were placing into them. Containers let you design the size and shape of the sandbox based on your exact specifications.
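
As a sketch of how lightweight a container-based sandbox can be, the single Docker command below runs a suspect program with no network access, a read-only file system, and no elevated privileges, then destroys the whole environment on exit. The image name suspicious-app is hypothetical:

  # Throwaway sandbox: no network, no writes, no privileges, gone when it exits
  docker run --rm --network=none --read-only --cap-drop=ALL suspicious-app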

Quarantined virtual spaces fit nicely into the sandbox metaphor, but actually implementing them is impossible without trained help. Whether you’re looking for enhanced security protocols or increased efficiency with your hardware resources, our virtualization services can help. Call us today.

Published with permission from TechAdvisory.org. Source.


Virtualization review: What is it again?

Every now and then, we need to reset the conversation about virtualization and review how it works in its most basic form. With so many advances, it can be hard to keep up if you're not a regular reader. This article not only defines virtualization and its benefits, it also includes a live virtual workstation for you to experiment with!

What is virtualization?

The simplest definition is this: It’s the act of creating a virtual (rather than physical) version of something, including hardware platforms, storage devices, and computer network resources. But that doesn’t do much for those outside of the IT industry.

We could paint a colorful analogy to try to better explain it, or we could let you paint with your very own virtualized demo. Follow these steps so you can see how virtualization works:

  1. Visit this website.
  2. Wait while your virtualized 1991 Macintosh boots up.
  3. Double-click the ‘Kid Pix’ desktop icon.
  4. Write “This is virtualization” on the blank canvas.
  5. Click (and hold) File, and select Save As.
  6. Click the Save button in the new window.
  7. Quit ‘Kid Pix’.
Voilà! Your picture was saved to that old-school Mac's virtual hard drive. That's because everything -- from the operating system to the processor -- is running on a server located somewhere else on the internet. And it's not just some remote desktop viewing trick: this '90s-era Mac and its hardware have been created by software installed on a server that is concurrently processing a million other tasks.

It’s a fun demonstration, but modern-day virtualization can accomplish much more.

Divide up hardware resources

The dated nature of that machine actually helps us better illustrate the biggest benefit of virtualization. The software that lets us create virtual machines also allows us to define exactly how much hardware each workstation gets.

For example, this Mac has only 3.8 MB of hard drive space, but if your virtualization server has 10,000 GB of space, you can create 100 virtual desktops with 100 GB of storage space each. It's a bit of an oversimplification, but that's essentially how it works with storage hardware, CPUs, RAM, and other hardware.
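
On a modern Linux host with KVM, carving one of those desktops out of the pool is a single command. A hedged sketch; the installer image and sizes are placeholders, and exact flags vary by virt-install version:

  # One of many identical desktops, each allotted 100 GB of disk and 4 GB of RAM
  virt-install --name desktop01 --memory 4096 --vcpus 2 \
    --disk size=100 --cdrom ubuntu.iso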

Reduce on-site costs

With virtualization, the bulk of your workstation and server hardware can be hosted off-site, which means lower utility bills, computer equipment requirements, and maintenance overhead. Instead of patching and upgrading each workstation's software and hardware individually, virtualization allows you to apply changes to all your machines at once.

Disaster recovery

If your virtualization server is hosted off-site, what happens when a natural disaster, power outage, theft, or vandalism strikes your office? Or, as a simpler example, where did you store your Kid Pix masterpiece? Certainly not on the machine you're reading this blog from.

Virtualization allows you to keep mission-critical data stored safely away from the office so your team can get back to work as soon as your IT provider gets them access to the server again. And with a single point of management (i.e., your off-site server), that can take place in virtually no time at all.

Ending your dependence on individual machines and their hardware is just one of the many ways to utilize the power of virtualization. You can define network hardware and configurations with software, run applications on any operating system, and so much more. To find out which solution is best for your business, call us today!

Published with permission from TechAdvisory.org. Source.


Microsoft and Citrix: a match made in heaven

Azure and XenDesktop may not be household names, but the newest partnership between Microsoft's cloud platform and Citrix's virtualization client is making big waves in the industry. Announced at Citrix's annual partner Summit, the newest thing in virtualization is a win for everyone.

For those who don't know, Azure is Microsoft's build-it-yourself cloud platform. With more than 600 services, Azure is all about giving network administrators access to Microsoft data centers to pick and choose how their cloud is structured.

Citrix is one of the largest virtualization software providers on the market. And its most famous product, XenDesktop, was one of the very first software solutions to allow multiple users to access Windows from a networked desktop with a different operating system already installed.

Now compatible with Windows 10

With the recent release of XenDesktop Essentials for Microsoft Azure, these two solutions are becoming one. Administrators can now build fully stocked Windows 10 desktops stored in Azure, and employees can access them from any machine with Citrix's lightweight client installed.

The whole setup costs only $12 per user, per month, and comes with a host of administration settings for managing and monitoring your virtualized desktops and how users access them.

A better way to work

It’s like Azure is a moving truck, XenDesktop is the box holding all your stuff in the back of the truck, and your company applications and settings are what’s inside the box. With the right configuration, the whole box can be delivered to employee desktops anywhere in the world.

As long as employees are accessing virtual desktops from verified devices running MacOS, iOS, Android, or even an older version of Windows, they can work as if they are sitting right in front of the Windows 10 install located within your company’s cloud.

Virtualization is a wonderful solution for cutting costs and increasing efficiencies. Unfortunately, even with two of the most user-friendly vendors in their respective industries, virtualizing Windows 10 desktops is still a monumental task. For 24/7 access to support and expert advice, call us today.

Published with permission from TechAdvisory.org. Source.


Networks: Software-defined vs virtualized

If knowing is half the battle, virtualization is one for the ages. With more than a decade of history, it’s a tough topic that business owners would be hard-pressed to ignore. Over the years, the terminology has changed and capabilities have gotten even more confusing. If you’ve ever heard anyone use software-defined networking and network virtualization interchangeably, it’s time we set the record straight.

Software-defined networking (SDN)

Managing storage, infrastructures, and networks with high-level software is something IT technicians have been doing for a long time. It’s a subset of virtualization and it is one of the oldest strategies for optimizing and securing your IT hardware.

Despite its popularity, SDN does have one major drawback -- it needs hardware to do its job. SDN allows you to control network switches, routers, and other peripherals from a centralized software platform, but you can’t create virtual segments of your network without the hardware that would normally be required outside of an SDN environment.

Network virtualization

Evolving beyond SDN was inevitable. Whenever a technology can't do it all, you can bet someone is working hard to fix that. Network virtualization uses advanced software to let administrators manage physical hardware and create virtual replicas of hardware that servers and workstations cannot distinguish from the real thing.

Network virtualization simplifies the field of network design. You can reduce spending on expensive hardware, reconfigure network segments on the fly, and connect physically separate networks as if they were in the same room.
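
For a taste of what "connecting physically separate networks as if they were in the same room" looks like in software, here is a hedged sketch using the Linux VXLAN overlay driver; the interface names and segment ID are assumptions:

  # Define a virtual network segment (VNI 42) tunneled over the physical NIC eth0
  ip link add vxlan42 type vxlan id 42 dev eth0 dstport 4789
  ip addr add 10.10.42.1/24 dev vxlan42
  ip link set vxlan42 up

Machines attached to VNI 42 at another site see one flat network, even though the traffic actually crosses the internet inside UDP tunnels.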

A virtualized network may sound like an exciting technology without much use at a small- or medium-sized business, but that's exactly the beauty of hiring a managed services provider! We provide enterprise technology and advice as part of your monthly service fee. Call today to find out more.

Published with permission from TechAdvisory.org. Source.


3 Virtualization issues to watch out for

Although data storage is only one of the many ways to benefit from virtualized hardware, it's still the most common use of the technology. Despite this popularity, virtualized storage is susceptible to a number of mismanagement catastrophes. We've outlined the three most common mistakes made with this technology right here.

Poorly structured storage from the get-go

Within a virtualized data storage framework, information is grouped into tiers based on how quickly that information needs to be accessible when requested. The fastest drives on the market are still very expensive, and most networks will have to organize data into three different tiers to avoid breaking the bank.

For example, archived or redundant data probably doesn’t need to be on the fastest drive you have, but images on your eCommerce website should get the highest priority if you want customers to have a good experience.

Without a virtualization expert on hand, organizing this data could quickly go off the rails. Ask your IT service provider to see a diagram of where your various data types are stored and how those connect to the software-defined drive at the hub of your solution. If there are too many relays for your server to pass through, it’ll be a slower solution than the non-virtualized alternatives.

Inadequately maintained virtualized storage

How long will your intended design last? Companies evolve and expand in short periods of time, and your infrastructure may look completely different months later. Virtualized data storage requires frequent revisions and updates to perform optimally.

Whoever is in charge of your virtualization solution needs to have intimate knowledge of how data is being accessed. If you’re using virtual machines to access your database and move things around, they need to be precisely arranged to make sure you don’t have 10 workstations trying to access information from the same gateway while five other lanes sit unoccupied.

Incorrect application placement

In addition to watching how your data is accessed as the system shifts and grows, administrators also need to keep a close eye on the non-human components with access to the system. Virtualized applications that access your database may suffer from connectivity problems, but how would you know?

The application won’t alert you, and employees can’t be expected to report every time the network seems slow. Your virtualization expert needs to understand what those applications need to function and how to monitor them closely as time goes on.

Deploying any type of virtualized IT within your business network is a commendable feat. However, the work doesn’t stop there. Without the fine-tuning of an experienced professional, you risk paying for little more than a fancy name. For the best virtualization advice in town, contact us today.

Published with permission from TechAdvisory.org. Source.


The benefits of hyperconvergence

If you thought virtualization was confusing, wait until you hear about hyperconvergence. By consolidating a number of virtualization services into a single piece of hardware that runs a single piece of software, small- and medium-sized businesses can enjoy the simplicity, cost effectiveness, and security of a cloud infrastructure in one on-site “box.” If you love everything about cloud computing and virtualization, a hyperconverged infrastructure should be the newest tool in your toolbox.

Using a hyperconvergence model to structure your network is very representative of the current trends in small- and medium-sized business technology. It’s about making enterprise-level solutions more accessible to those looking for a smaller scale. So although a lot of these benefits sound like the same points we argue for other technologies, let’s take a look at how they are unique to hyperconvergence.

Software-centric computing

It may not sound huge at first, but by packing everything you need into a single box and wrapping that box in flexible, adaptable management software, you empower your hardware infrastructure to receive more regular patches and updates. This makes it much easier to add more hardware later or restructure what you're currently using.

Unified administration

Hyperconvergence consolidates a number of separate functions and services into one piece of technology. Whoever is managing your virtualization services can tweak storage, cloud, backup, and database settings and workloads from one place.

Streamlined upgrading

Different hyperconvergence “boxes” come in different sizes and capabilities. So all it takes to scale up is buying another unit based on your forecasted needs. If you’re in a place where all you need is a little extra, purchase a smaller upgrade. But when you’re expecting rapid growth, a bigger box will ensure your IT can expand with your business.

Stronger data protections

Complexity is the Achilles' heel of most networked IT. When a small group of people is trying to stay on top of a mounting pile of account management settings, malware definitions, and data storage settings, it's hard to keep constantly probing cyber-attackers from finding a security hole. But with a hyperconverged infrastructure, your virtual machines aren't built by bridging a series of third-party services together -- it's all one service.

Keep in mind that while hyperconvergence is simpler than most virtualization solutions, it's not so simple that the in-house IT departments at most small- and medium-sized businesses can manage it alone. The benefit of a more unified virtualization solution when you already have a managed services provider is the speed at which your growth and evolution can be managed.

The better your technology, the faster we can make changes. And the faster we can accommodate your needs, the less downtime you experience. Call us today to find out more about a hyperconverged system.

Published with permission from TechAdvisory.org. Source.


Guide to large-scale AWS cloud migration

We'll just go ahead and say it: cloud migration is a smart business move, and we highly recommend it. The potential for greater efficiency, more manageable storage capacity, and cost savings is all but guaranteed. Virtualization, however, is not a walk in the clouds. It often involves a complex process that requires time and money, so if you're considering a large-scale migration to Amazon Web Services, read on to be prepared.

Preparation for migration

  • Is everyone within the organization on board with this major move? Are your employees adequately equipped with knowledge about the cloud? And, since large-scale transfers involve big data, would your security framework be able to deal with potential security threats during the transition? Can your company handle the inevitable expenditure that goes with investing in the cloud? These are just some of the points you have to consider when preparing for large-scale migration.

Reasons for migration

  • One of the most compelling reasons to virtualize tech capital is the need to meet your business's increasing demand for efficiency, which could lead to greater profitability. Other reasons could include a change of organizational leadership or a shift in business structure that necessitates storage recalibration. Regardless of your reasons for migrating to the cloud, you as a business owner should have a clear understanding of why you're doing it, and make sure everyone understands why it is so important.

Size of resources to be moved

  • Using Amazon Web Services' cloud storage eliminates the cost of buying your own storage infrastructure and introduces anywhere-anytime access to your business's data and applications. That said, you must consider how much you'll be transferring, and use that as your basis for moving. Knowing the amount of IT resources you're freeing up lets you allocate them more cost-effectively and allows your technology staff to focus on more innovative pursuits.

Migration requirements

  • Which specific data, servers, or applications need to be migrated? Does your company need large-scale migration, or can it survive on moving only a small part of your resources to the cloud? Perhaps a subsidiary could survive without being moved to the cloud at all. When migrating to the cloud, you'd be remiss not to think through these details.

Impact to the business

  • Temporary downtime is something you have to be ready for. You might need more time or you might need to consider alternatives for the brief interruptions that come with migration, and of course budget can be a major factor in your decision to move. You can save your business from unnecessary obstacles by first assessing its ability to handle these situations.
Recalibrating the management of your technological resources for scalable storage solutions in a cost-saving platform is not without its challenges. Your business and its stakeholders’ call for greater efficiency cannot be ignored. After considering these factors for a large-scale migration, you might realize that despite a few minor bumps, the benefits to your organization will far outweigh the projected costs, and that there’s nowhere to go but up (in the cloud).
Published with permission from TechAdvisory.org. Source.


Containers Vs. VMs: performance variations

Virtual containers have incrementally increased the ability of users to create portable, self-contained kernels of information and applications since the technology first appeared in the early 2000s. Now, containers are one of the biggest data trends of the decade -- some say at the expense of the virtual machine (VM) technology that preceded them. Read on to find out some of the performance differences between containers and virtual machines, and how the two can work together for your business.

When it comes to the virtual world, containers and VMs are not all that different. The VM is a good option for those who need to use more than one operating system in the course of a business project, while containers serve those who are comfortable staying within a Linux or Windows operating system without deviating. There are performance advantages to using containers, although these are counterbalanced by organizational advantages derived from a VM system.

Performance Nuances

VMs and containers both work from a virtual platform; therefore, the differences in performance relate to how they are configured and utilized by the people who maintain them.
  • Faster startup time: Containers don't have as much to start up, making them open more quickly than virtual machines. While it may not seem revolutionary, this can be up to a few minutes per instance -- a cost that adds up to quite a bit over the course of a year or more.
  • Resource distribution: Containers pull hardware resources only as needed, while a VM requires a baseline of resources to be allocated before it will start. If two VMs run the same program at the same time, each carries its own full copy of the supporting environment, even when much of it sits idle.
  • Direct hardware access: A VM cannot pull information from outside of itself (the host computer), but a container can utilize the host system as it runs. This may or may not matter depending on what your users are doing, but certainly puts a point in the container column nonetheless.
Although it appears that containers outperform virtual machines in most areas, there are uses for the VM environment, particularly for a business on the rise. With a virtual machine you have a security advantage, because each VM is encapsulated with its own operating system and data configuration; additionally, you are not limited to the use of one operating system.
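
The startup-time difference described above is easy to measure yourself. This sketch assumes a machine with Docker installed and uses the public alpine image:

  # A container cold start typically completes in well under a second...
  time docker run --rm alpine true

  # ...while booting even a minimal VM means firmware, kernel, and service
  # startup, usually measured in tens of seconds or minutes.

Multiply that gap across hundreds of instances per day and the operational savings mentioned in the list become concrete.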

Virtualization is an incredibly tricky solution to grasp in its entirety. New avenues spring up all the time to get more use out of its benefits, and it might be tempting to take a “wait and see” mentality. In reality, one of the best things about virtualization is how adaptable it is as a business solution. We suggest you get into the game as soon as possible; give us a call so we can tell you how.

Published with permission from TechAdvisory.org. Source.


Reviewing vSpace Pro 10

When it comes to doing business today, it is all about computers and virtual platforms. The virtual desktop, or virtual machine, has long been a major component of doing business and giving employees individualized access to the information and programs they need to do their work. However, just as business changes, so must the virtual desktop. vSpace Pro 10 has been introduced as a fresh take on the virtual desktop platform. Read on to learn how vSpace Pro 10 works and whether or not it could benefit you and your business.

The traditional way companies let multiple employees use company systems is to provide every user with their own copy of Windows, installed separately on each machine. However, this can be quite cumbersome. If Windows requires a patch, each machine needs to be individually accessed and updated. It is also expensive, since the business must purchase individual copies of Windows and other software.

The idea behind vSpace Pro 10 is to do away with this expensive and sometimes inefficient type of virtual desktop system. vSpace Pro 10 requires a company to purchase only one copy of Windows, which is housed on what is known as a host server. Each user then receives their own customized virtual desktop, served from that single installation.

There are many reasons this can benefit a business. First of all, the maintenance costs, time, and effort will be significantly reduced because you will deal only with one copy of Windows rather than several. The initial system costs will also be much lower than alternative options.

You could also potentially save on your energy bills, as you would need to operate fewer machines at once by hosting the core operating system and multiple virtual desktops in a single central location. The best thing about vSpace Pro 10 is how simple and easy it is to use and to operate once installed, and the initial costs and installation process are simple as well. The streamlined nature and efficiency of vSpace Pro 10 make it one of the best virtual desktop platforms available for businesses today. If you would like to know more or want to get started, contact us as soon as possible.

Published with permission from TechAdvisory.org. Source.


Virtualization containers 101

There is a trend in the IT world toward the use of 'containers' as a virtualization strategy, and it's one that seems to be gaining popularity. Virtual containers work much like shipping containers, which made transporting bulky goods uncomplicated and uniform. Every small- and medium-sized business owner should learn how containers work before choosing a virtualization solution, and we've collected all the necessary details right here.

Why are containers so popular?

Before the introduction of containers, virtual workstations and servers allowed users to access computing power and software delivered across a local network or the internet. This technology took cloud computing and web hosting a step beyond simply serving software from a website, creating entire desktop experiences over the internet. However, it is a tad inefficient, since running one small application still requires an entire hosted desktop.

Containers guarantee developers that their software will run smoothly, regardless of what type of computer their end user is running.

How containers improve on virtual desktops

Containers operate quite differently because they only package applications and their minimal requirements into a deliverable package. This makes it possible to deliver several containers to several different users with a significantly smaller footprint on the machine hosting the service.

There are a handful of software packages that create and deliver containers, and the most popular is Docker. Containers existed for some time before Docker's release, but they were complicated and difficult to manage. As virtualization services grew in popularity, software vendors gained significant resources to build friendlier and simpler container solutions.

Although containers have made big improvements in enterprise computing, virtual machines still have a role to play in select circumstances. In both solutions, older equipment can be reappropriated to utilize much bulkier software hosted in the cloud. All you need is an internet connection, and an experienced IT professional to help you set it up. If you’re interested in either virtualization or accessing your applications in a container environment, please contact us today.

Published with permission from TechAdvisory.org. Source.


Microsoft Edge browser becomes more secure

Microsoft's Edge browser has enhanced its security with new virtualization protocols. By running the browser inside a virtual container, Windows keeps web content totally separate from the Edge browser and your hard drive. Although this is a much smaller scale than we are used to seeing from Microsoft's virtualization strategies, it is a gigantic boost to Windows's native internet browser.

Browsers are one of the most popular avenues for cyber-criminals to deliver their wares, and Microsoft's new security measures set out to reduce that risk significantly. In a first for internet browsers, Microsoft has burned any potential bridges between malware and PC hard drives. The new, virtualized Edge is only available for Windows 10, and administrators will be required to choose what runs inside and outside of the container.

When enabled, malware cannot gain access to anything outside of the Edge browser. Think of it like reheating your leftover lasagna inside a covered container; when that gooey mozzarella tries to muck up the walls of your microwave, your tupperware ensures the microwave stays clean. So in our case, the cheese is malware, and even if you download malware from an untrusted site, it cannot reach beyond the container that Edge uses to protect your files.

According to tests run by Microsoft, the Edge browser has the lowest chances of malware infection when compared to other browsers running on Windows. And that means a lot when you consider that when it comes to cyber-attacks, the default Windows browser is always the first target.

In addition to creating containers for limiting the exposure of workstations, any malicious data is deleted by resetting the virtual space after users are done with it -- not unlike tossing your dirty tupperware into the dishwasher after reheating last night’s saucy noodle goodness. Permanent cookies aren’t kept after the reset, and it’s impossible for malware to continue running without a space to do so. Every new session starts with a clear, clean browser.
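
The capability described here shipped as Windows Defender Application Guard for Edge (our identification; the article doesn't name the feature). Assuming that's the one, turning it on is a single elevated PowerShell command on a supported Windows 10 machine:

  # Enable the virtualized Edge container feature; a reboot is required afterward
  Enable-WindowsOptionalFeature -Online -FeatureName Windows-Defender-ApplicationGuard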

For those new to the virtualization game, it may seem like running Edge in this environment could slow down the machine. But Microsoft promises the service imposes only an extremely light burden when enabled. When your organization is looking for virtualization services -- from creating all your desktops in a virtual, internet-based space to simply making your browsing more secure with a virtualized Edge browser -- there's only one team to call. Pick up the phone and dial us today. You're a short consultation away from a cheaper, safer IT infrastructure.

Published with permission from TechAdvisory.org. Source.


VMware’s Project Goldilocks: what is it?

Almost every day, the virtualization industry takes a giant leap forward. Although this industry has been reserved for only the most technologically advanced of businesses over the years, it’s spreading like wildfire with advances in cloud computing. As engineers create virtual versions of hardware, storage, and even networks, digital architects are coming up with entirely new ways to design your IT framework. Today’s development comes in endpoint security, and we’ve got everything you need to know right here.

A virtual network is a way to connect two or more devices that aren’t physically linked by wires or cables. From the perspective of machines on a virtual network, they’re essentially sitting in the same room -- even if they’re on opposite sides of the globe. The advantages of this setup range from ease of management to reduced hardware costs. AT&T and Verizon have begun offering these services, and small- and medium-sized businesses have slowly begun to adopt them.

Meanwhile, another sector of the IT world has been making its own advances. Cutting-edge hardware firewalls are beginning to offer internal segmentation as a method of separating pieces of your internal network to keep them safe from threats that spread internally. The more segments you have, the safer your network is from poorly protected neighbors. But there are limits to how much capacity one of these hardware firewalls has for segmentation.

Virtualization giant VMware has taken notice and developed a prototype to combine these two services. In the hopes of unleashing ‘microsegmentation’ from the limits of physical hardware, Project Goldilocks will essentially create a virtual firewall for every virtualized application. When one of these applications is created or installed, it will come with a ‘birth certificate’ outlining every acceptable function it can perform. When making requests to the operating system, network, or hardware the application is installed on, Goldilocks will cross-reference the request with the birth certificate and deny anything that hasn’t been given permission.

Segmenting virtual networks and applying them to individual applications rather than entire networks or operating systems could revolutionize the market for endpoint security. Not only would it be easier to block malware infections, but those that made it through could be quarantined and terminated immediately because of the virtual nature of their location.

While virtualization may be a complicated, state-of-the-art technology, all it really takes is a helping hand. With our full team of specialists, we're ready to pull you into the next stage of your virtualized infrastructure. All you need to do is reach out to us -- why not do it today?

Published with permission from TechAdvisory.org. Source.


VMware releases security patches

Sometimes technology solutions seem safer merely because they're not widespread enough to be a lucrative target. Although virtualization is increasingly popular, its resilient protection protocols and historically low adoption rates tended to tip the cost-benefit calculation against creating an exploit. Or at least, that was the case. Late last month, VMware announced an update to patch a gap that allowed attackers to compromise virtualized cloud infrastructures. We've compiled everything you need to know to protect yourself.

Since its first software release in 2001, VMware has remained the leading provider of virtualization platforms, with most sources estimating double-digit leads in market share over its nearest competitor. By creating virtual environments stored on a network server or in a cloud environment, the company has given its clients the ability to create workstations, software, and even networks that can be used remotely. Fast forward to today, and VMware is working overtime to maintain that reputation by preempting software security vulnerabilities.

Obviously, when delivering any kind of specialized privileges over a network, adequate protection is of the utmost concern. In this case, two services for managing mobile clouds (vIDM and vRealize) were found to be vulnerable to exploits wherein users with minimal rights could cheat their way into full administrative privileges.

The security team at VMware elaborated that when executed in just one of the two services, this flaw would not be considered critical. However, when combined, it could pose an imminent threat to the security of your cloud infrastructure. To amend this oversight, ask your managed services provider or IT staff to update vIDM and vRealize to their most recent versions (2.7 and 7.1, respectively) as soon as possible. If this can’t be achieved in a realistic time frame, blocking port 40002 would act as a temporary workaround.
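
If you need the stopgap, blocking that port on a Linux-based system is one line with iptables (a sketch; your environment may sit behind a different firewall entirely):

  # Temporary workaround: drop inbound TCP traffic to the vulnerable port
  iptables -A INPUT -p tcp --dport 40002 -j DROP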

Sufficient security requires by-the-minute responses to the latest breaches and exploits. By partnering with us, you’ll never need to worry about checking in regarding patches or breaches you read about in the news. Instead, you’ll be hearing about them from us when we come around to install the updates. Choose the safe option -- contact us today with any of your virtualization needs or questions.

Published with permission from TechAdvisory.org. Source.


Cloud-based application virtualization

Citrix is one of the biggest names in the virtualization sector. It currently serves over 330,000 organizations, and by teaming up with Microsoft to expand its cloud-based software delivery, the company hopes to give that number a boost. While the news of this partnership does mean winding down one popular software-as-a-service offering, a newer -- and hopefully better -- one is on its way. Keep reading to find out how this announcement affects your organization.

What Citrix’s XenApp already does is deliver applications to users via a variety of methods and pathways other than local installations. The process starts with the creation of server-stored software containers that allow the services an application provides to be delivered to your staff members from a centralized server. XenApp enables you to set rules and procedures for when and how these features can be accessed, and it creates a multitude of versions of the software that can be delivered to different operating systems, devices, and locations.

In a press release back in May, Citrix made a bombshell announcement that it would create cloud-based versions of all its virtualization packages using Microsoft’s Azure as the foundation. While the two companies have been closely aligned for decades, this is an enormous boost to both their reputations. Fast forward to today, and we’re seeing the first rays of sunshine from this new team-up.

And much more than simply lending Citrix the foundation, Microsoft will be directly involved in the development and release of the new cloud-based version of XenApp. The two companies have promised to work together to combine the simplicity and scalability of Azure with the administration and performance improvements of XenApp, thereby creating the most comprehensive software-as-a-service (SaaS) provider on the market.

Because Microsoft’s RemoteApp already acts as an Azure SaaS platform, the potential for conflict means it will be wound down to its eventual sunset in August 2017. But fear not; for faithful users of this service, Microsoft has promised a clear transition plan to reduce the possibility of growing pains.

Cloud-based XenApp is just the first of many improved services to be born out of the partnership between these two titans of tech. Rumors are swirling that XenDesktop will get the same treatment and a release won’t be far behind. Regardless, the tech industry is moving ahead with the virtualization of everything it can get its hands on, and it's time to jump on the bandwagon. When you’re ready to make the leap, our experts are ready to pull you aboard. Contact us today for answers to all of your virtualization questions.

Published with permission from TechAdvisory.org. Source.


Telecoms offering network virtualization

With virtualization yet to make its way into the lexicon of common tech phrases, many business owners are still trying to decipher the full extent of its value. Various aspects of the service have evolved over time, and we can probably expect more to come. For now, however, one of its existing functions is getting a boost from the likes of AT&T and Verizon. Virtualized network services are complex and often difficult to understand, but their value is unquestionable. Let’s delve a little deeper.

The overarching theme of virtualization is combining hardware and software resources into one large, communal pool from which individual servers and workstations can pull as much as they need, rather than allocating them inefficient individual puddles that either run dry or go unused. In a workstation model, this is realized in the form of minimally equipped endpoints that access much more powerful software and hardware resource pools via a web browser.

When it comes to network virtualization, it’s quite similar to virtual private networks, or VPNs. Developments in cloud and software functionalities allow administrators to eliminate time-consuming and micromanaged VPNs while achieving the same swift and secure connections between servers, workstations, and other network-enabled devices residing in physically separate networks. VPNs are like pumps between each of your network ‘puddles,’ individual pieces that require individual maintenance. Network virtualization is like installing one pump with on/off switches on each outbound pipe.

This means that setting up the equivalent of a VPN for a new satellite office is as simple as entering a few pieces of information to gain a world of seemingly local network possibilities. In another example, rather than recabling your office when one department becomes too big to fit on one switch or hub, connections and protocols can be expanded and redefined by a software client.

With the increasing popularity of any service with the word ‘virtualized’ in it, telecom carriers Verizon and AT&T have begun offering this service to clients. Whether it's because your business has a growing list of locations, or because your local network needs the flexibility to grow swiftly without waiting for costly hardware and software expansions, these services have you covered.

Both Verizon and AT&T will offer three ways to manage your virtualized network: locally, from the cloud, or a hybrid combination of the two. Once you've decided on a framework and deployment strategy, make sure to take your time with the transition. Established networks are complex, messy ordeals, and it's better to migrate those puddles one by one rather than all at once, so any problems that arise can also be dealt with one by one.

Although consumer-level companies like AT&T and Verizon are offering this service, not just anyone can hop on and start getting the most out of a virtualized network. It takes expert configuration, deployment, and, most importantly, maintenance. With 24/7 coverage of your network, we eat, sleep, and breathe cutting-edge technology. Call us today and we'll bring your SMB into the age of virtualization.

Published with permission from TechAdvisory.org. Source.


Amazon Web Services’ simpler data migration

Any business owner who has taken the time to truly understand how virtualization works knows that the final product is invaluable. But since nothing worth doing is ever easy, there are a lot of things that might scare SMBs away from making the leap. Amazon Web Services is working tirelessly to make that leap a lot easier and less frightening with their Data Migration Service. Let’s take a look at the latest development.

As a quick review, virtualization is best imagined by visualizing your server as a house. When a user draws computing power from your server, it’s a lot like opening the front door and just telling anyone to come in and grab whatever they need. The house gets crowded and messy quickly. Virtualization allows you to create doorways into partitioned rooms, with specifications and permissions unique to the user or application that needs them.

Much like the house in our analogy, the hardware and upkeep of servers can become quite expensive. By taking virtualization one step further, Amazon Web Services (AWS) has created the equivalent of a gigantic apartment building, online. When renting these internet-based apartments, your SMB is presented with virtualized versions of your server and desktops. You realize huge cost savings by eliminating upfront capital expenditures on hardware, and the rooms and their contents can be modified and adapted with little more than a simple request to AWS.

So you’re ready to migrate your server and clients to a virtualized environment, but after hiring experts you’re informed they will need days -- possibly weeks -- of server downtime to move your data from on-site storage to AWS. This service interruption has long been a massive speed bump in cloud migration projects. With Amazon’s Data Migration Service (DMS), that server downtime can be reduced to as little as 10-15 minutes.

This significantly reduced downtime is achieved by keeping your database live during the migration process. The final product can be stored in one of AWS’s several regional datacenters, or even copied back to your on-site server for concerns about redundancy and continuity.

Another speed bump along the road to your new virtualized home is moving from one database schema to another. Imagine the front door of your original, one-bedroom house is shaped like a triangle. But your destination, the AWS home, has a front door in the shape of a circle -- how will you get your data into its new home? Amazon’s DMS has added a new tool to take care of all of that for you. This means it doesn’t matter if you have an Oracle or MySQL on-site server; Amazon can almost effortlessly convert it to a new schema.
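
For scale, here is roughly what kicking off such a migration looks like from the AWS command line. The ARNs and names below are placeholders, and the exact options depend on your source and target engines:

  # Start a full-load migration task between two pre-defined DMS endpoints
  aws dms create-replication-task \
    --replication-task-identifier move-to-aws \
    --source-endpoint-arn arn:aws:dms:...:endpoint:SRC \
    --target-endpoint-arn arn:aws:dms:...:endpoint:TGT \
    --replication-instance-arn arn:aws:dms:...:rep:INST \
    --migration-type full-load \
    --table-mappings file://mappings.json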

With such a valuable tool, AWS must be charging a fortune to utilize it, right? Wrong. Amazon promises DMS migrations will cost as little as three dollars per terabyte. Even if your business is hosting above average amounts of data on-site, that’s still a price tag any cash-strapped SMB can afford.

Just because there’s a new tool for the migration, doesn’t mean you should attempt buying a new home for your server and driving it across town alone. Think of us as your virtualization real estate agent and movers, all wrapped into one. For an inexpensive, swift migration to a virtualized environment, we’ve got just the place for you. Contact us today.

Published with permission from TechAdvisory.org. Source.


The 5 most popular virtualization platforms

Just understanding what office virtualization is can be difficult enough; picking from the long list of software providers that help you achieve it can feel impossible. Since virtualization is a relatively new practice for small and medium-sized businesses, there isn't even a standardized way to go about virtualizing your company. So why not start with this simple list of the five most popular virtualization options and their strengths?

VMware

Any conversation about virtualization for small and medium-sized businesses usually starts around VMware. Although it wasn’t necessarily the first, VMware was the company that really put office virtualization on everyone’s action item list. The company offers a number of different solutions for different sized businesses with a wide variety of needs. Its ease of use and robust security features have secured its reputation as one of the best options for virtualization at SMBs.

Citrix

The average user may not recognize the company name, but has a good shot at knowing its popular remote access tools, GoToMyPC and GoToMeeting. Citrix has specifically geared its virtualization software -- XenApp, XenDesktop, and VDI-in-a-Box -- toward SMBs, and even claims that non-IT staff can easily manage and administer the services. It even provides a free trial to prove it.

Microsoft

Although it may be a little more difficult to manage without an in-house or outsourced IT staff, Microsoft’s Hyper-V option is hard to ignore considering its integration with the popular cloud platform Azure. Whether you’re a Microsoft loyalist or you just want to minimize the number of vendors in your network, Hyper-V offers everything you need from a virtualization service.

Oracle

This company just keeps getting bigger and bigger. Specializing in marketing software, Oracle also offers database management, cloud storage, and customer relationship management software. If you're using any of its services already, there could be benefits to enlisting its virtualization services as well. Oracle does everything -- server, desktop, and app virtualization -- and believes that consolidating all of these into one solution is what sets it apart.

Amazon

And since we're on the topic of household names, let's talk about Amazon's EC2 platform, which hosts scalable virtual private servers. The ability to scale and configure capacity is definitely EC2's biggest draw for SMBs preparing for the possibility of rapid growth. Although almost any virtualization service is rooted in scalability, Amazon leads the pack in how quickly and finely you can adjust your solution to your individual needs.
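
As a small, hedged illustration of that scalability (the AMI ID and instance type below are placeholders), launching three identical virtual servers is a single AWS CLI call:

  # Spin up three identical virtual servers in one command
  aws ec2 run-instances --image-id ami-0123456789abcdef0 \
    --instance-type t3.micro --count 3

Scaling back down is just as quick, which is why capacity on EC2 can track demand so closely.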

Virtualization is a really hard topic for most SMBs to tackle. This list covers only the most popular vendors, and there are plenty more out there. Choosing one based on its application possibilities and management requirements is not a task for the faint of heart. Get in touch with us today so we can break down all of the technobabble into easy-to-understand advice and expertise.

Published with permission from TechAdvisory.org. Source.
