Apple Expands Reach on USB-C: What’s Next for Future Devices?

The USB Type C port on the new 2015 MacBook

In 2015, Apple relaunched the MacBook: a complete overhaul of the line that had been its flagship before the move to the Pro/Air series. The new MacBook hit the shelves with features that were relatively new to the laptop world: a fanless Intel Core M processor, a butterfly keyboard, and a beautiful Retina display. The product was praised for its innovation in many areas, but what took the technology world by storm wasn’t the processor, the display, or even its debut in three different finishes. No, the big discussion was its ports, or lack thereof.

This MacBook featured a single USB Type C port, opposite a 3.5mm headphone jack on the other side of the computer. The smallest and thinnest member of the MacBook family, it left every other port off the case and relegated them to the world of adapters. While many were left scratching their heads, Apple not only sold plenty of these devices but was also praised for product innovation and for debuting the newest type of USB on its hardware.

Fast forward to the fall of 2016. Apple’s newest line of MacBook Pros had just been announced, featured, and released. Amid the new keyboard previously seen on the lower-level MacBook, and even the Touch Bar with Touch ID sitting atop the keyboard, the question raised once again was this: what about its ports?

The new MacBook Pro with USB-C ports

Now that Apple has brought the USB Type C port to its higher-end laptops (all of the MacBook Pro line), what can we expect from future devices? Will Apple follow the likes of Google and Motorola and bring the newest port to its iPhone and iPad (and iPod?) lines?

Next in line for an overhaul among Apple’s core devices is the MacBook Air. Praised as the perfect everyday computer, it isn’t built for heavy usage and professional applications, but its long battery life, ease of use, and efficiency in a small form factor make it perfect for the average user and student. It currently features a MagSafe 2 charging port, two USB 3.0 ports, and a headphone jack.

MagSafe 2, the hyped-up successor to the original charging port of the older MacBook Pros and Airs, has now been phased out on two of the three MacBook lines. Removing it from the Air is the logical next step. That would pave the way for the Air to take the next step in innovation and adopt a Type-C port for charging, fitting right in with its brothers in the lineup. Since the port that charges the machine is also lightning fast for data, remove the other USB ports and you’ve got yourself a MacBook Air with multiple Type-C ports and a headphone jack, along with the improvements in display and keyboard that should come with it.

But what does this mean? Is the adapter life going to consume us for the rest of time? We don’t know that answer yet, but it is worth thinking about. For Apple, and the many companies likely to follow suit, there is a huge market in customers purchasing dongles and adapters to hang like winding branches off their laptops. With the likes of HDMI, Thunderbolt, and Ethernet still very much necessary in this day and age, will companies phase those ports out and stick to adapters forever, or will Apple learn from the adapter game and start integrating them back into its devices, using these next models as a sort of “testing phase”?

For now, we’ll see where this takes us with the product releases in the spring and fall of 2017, but know this: USB Type C is here to stay among Apple devices, and there’s no getting around it. Maybe we’ll see it dominate everything from the iPhone and iPad to the MacBooks, and perhaps even desktop computers.

Amazon’s Echo and Alexa: A User’s Experience

Introduction:

Over the holiday break, a new piece of technology arrived in the Afonso household: we purchased an Amazon Echo Dot. At $50, the price seemed reasonable enough that it was worth a shot to get on the cutting edge of smart home technologies. Unfortunately, due to a lack of smart devices in our home, we were unable to use Alexa’s greatly touted integrations with things like Nest products or Zigbee-based lighting. However, Alexa can be used for much more than controlling a smart home, so I’ll speak to some Alexa Skills (the Echo’s version of applications) that we tried and our experiences with them.

Built-In Functionality:

Out of the box, the Echo can be easily configured to integrate with a wide array of streaming media services. Pandora, Spotify (restricted to Premium accounts), iHeartRadio, and TuneIn Radio are all built in. This makes the Echo a perfect candidate for a smart radio: it has a small built-in speaker (a low-fidelity mono speaker, so an external one is recommended) as well as Bluetooth connectivity for larger and better audio equipment. News, weather, and sports integrations are also built in, and need to be configured at setup time using the Amazon Alexa app (available for both iPhone and Android). There is also built-in smart device detection, which I was unable to experiment with, that detects smart devices and performs a pairing procedure so you can use keywords like “turn on” and “turn off”; additional smart home skills are needed for in-depth control of other smart devices.

The Skills:

These can be turned on for use with your device by stating “Alexa, enable *insert skill name here*”.

To use any skill, state “Alexa, open *insert skill name here*”.

To use a skill and pass it information, state “Alexa, ask *insert skill name here* to *insert parameter name here*”
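
For example, to get a commute estimate from the Drive Time skill covered below, you might say, “Alexa, ask Drive Time how long it will take to get to work” (this assumes you’ve already saved “work” as one of your favorited locations).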

Anymote Smart Remote:

After configuring the skill using the Anymote app for iPhone (instructions are openly available), I was able to control my Roku smart TV device using my voice. Simply state “Alexa, open Anymote” followed by the remote input you’d like to perform, such as “volume up,” “home button,” or “up button,” and it will interact with the device you’ve configured. Overall, a very useful skill for anyone looking to remotely control their network-connected devices.

Jeopardy J6:

This is a shortened version of the classic game show that lets the user answer clues from a recent airing of the show. Alexa will tell you whether the response you provided to each clue was correct. The performance of this skill was superb, and it really allows for an interactive experience with the Echo.

Twenty Questions:

This is another classic game: you think of any object (limited to a category set), and within twenty questions Alexa will aim to guess what you’re thinking of. The interactivity of this skill is also superb, and the shock when Alexa gets those obscure guesses correct is pretty amazing.

Ooma Telo:

This skill is perfectly utilitarian – it allows the user to place a call via the Echo, with a caveat: the call must be completed using an existing phone line and cannot proceed over the Echo itself. Essentially, once you ask Ooma to place a call, it initiates a three-way call between its VoIP service and the phone you choose, so it’s limited to initiating calls.

Drive Time:

This skill is the perfect companion for a commuter. Since Alexa has no built-in travel time estimation, Drive Time lets you ask for driving times to favorited locations that you configure yourself. There is no search function, and locations must be entered beforehand in the skill’s settings in order to use them.

Experience Summary:

Alexa can do some really remarkable things, and since the Echo was released about two years ago (2014), the Skills developed in that time extend its functionality to a variety of platforms and devices. The Echo Dot does beg for a better speaker, but at the $50 price point that’s expected, and it also provides an incentive to buy the Dot’s larger sibling, the standard Echo. The overall versatility of the device’s connections (3.5mm output, Bluetooth, and wireless connectivity, plus Zigbee devices via a Wi-Fi hub) makes it perfect for controlling audio and other devices.

Gaming on a MacBook Pro

Despite what the average internet commenter will tell you, MacBooks are good at what they do. That’s important to remember at a time when fanboying is such a prevalent issue in the tech consumer base. People seem eager to take sides, and binary criticism erases the reality that machines can have both good and bad qualities. MacBooks are good at what they do, and they also have their disadvantages.

One of the things MacBooks aren’t good at (mostly due to their architecture) is playing games. If you’re looking for high-performance gameplay, Windows machines are objectively better for gaming. Despite this, there are plenty of games and workarounds that’ll still let you have fun with friends, or in your dorm room after a long, stressful day, even on a MacBook.

Note: I’ll only be listing the methods and games I’ve personally found to work well. There are likely tons of games and methods that work great, but I haven’t tried yet.  While I’m aware you can always install Windows via Boot Camp, I’ll only be touching on methods and games that don’t require altering the OS or running a virtual machine. Below is a screenshot of my machine’s specs for reference.

[Screenshot: my machine’s specs]

Actually Getting Games  

Do you like games? Do you like sales? Do you often fantasize about purchasing AAA games for prices ranging from Big Mac to Five Guys? Steam is the way to go. You can get Steam here, and I highly recommend you do. Steam is great because of its frequent sales, interface, and ability to carry over your purchases between machines easily. A good amount of Steam titles are supported on Mac OS, so if you’ve been previously using a Windows machine and have a huge library, you won’t have to repurchase all of your games if you switch to a new OS. You can also purchase some games off of the App Store, though the selection there is far smaller in comparison.

Configuration 

If you’re planning on playing an FPS on your MacBook, you’re likely going to want a mouse. A mouse is far more accurate and comfortable than a trackpad when it comes to interacting with most game interfaces. However, after plugging in your mouse you might find that it feels…weird. It accelerates and slows itself down sporadically and probably feels like it’s fighting you. No need to worry! This is a simple fix.


First, launch Terminal and enter the following command:

defaults write .GlobalPreferences com.apple.mouse.scaling -1    

This will disable Mac OS’s built-in scaling and allow you and your mouse to have healthy bonding time without it suddenly deciding to perform an interpretive dance in the style of the plastic bag from American Beauty.
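
If you ever want the old behavior back, the same defaults tool can inspect or remove the setting (note that changes to com.apple.mouse.scaling only take effect after you log out and back in):

defaults read .GlobalPreferences com.apple.mouse.scaling
defaults delete .GlobalPreferences com.apple.mouse.scaling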


Another bonus piece of advice: go to System Preferences > Keyboard and check the option to use the function keys as standard function keys, without having to press the fn key. If you’re playing games that make use of the function keys, you’ll find it easier to hit one key instead of taking your hand off the mouse to hit two.

 

Finally, I recommend you keep your system plugged in and on a desk. Just like with most laptops, demanding processes like games can drain the battery faster than Usain Bolt can run across campus, and make your laptop hotter than that fire mixtape you made in high school.

Solo game recommendations

So, you’ve set up your mouse and keyboard, installed Steam, and you’ve got some free time to play some games. What now? Well, not every game that’s listed as “compatible” with Mac OS actually works well on Mac OS. Some games lag and crash, while others run at a high frame rate with no problems. Here are a few games I’ve found work well on my system. (Reminder: performance may vary.)

1. h a c k m u d


“h a c k m u d” is a game that is set in a cyberpunk future where you’re a master hacker. This isn’t Watch_Dogs though. You’re not “hacking” by pressing a single button; rather, every single bit of code is typed by you. If you don’t know how to code, the game does an alright job at teaching you the basics of its own language (which is like a simplified mix of HTML and Java). The first hour of the game is spent locked in a server where you’ll have to solve some interesting logic puzzles. Once you escape the server, the game suddenly becomes a fully functional hacking MMO entirely populated by actual players. The game runs well on Mac OS, as it’s almost entirely text-based.

2. Pillars of Eternity


Do you like classic CRPGs? If the answer is yes, you’ll probably love Pillars. It’s a CRPG that fixes a lot of the problems the genre faced during its golden age, while not losing any of its complexity and depth. The game runs well, though do expect a loud and hot system after just a few minutes.

3. SUPERHOT


Do you often dream of being a bad-ass ninja in the matrix? SUPERHOT is a game where the central gimmick is that time only moves when you move. More accurately, time moves at a fraction of a second when you aren’t moving your character. This allows for moments where you can dodge bullets like Neo and cut them in half mid-flight with a katana. The game runs great, though your system will quickly get super hot (pun intended).

4. Enter the Gungeon


Enter the Gungeon is a cute little rogue-like bullet hell where your goal is to reach the end of a giant procedurally generated labyrinth while surviving an endless onslaught of adorable little sentient bullets that want to murder you. The game is addictive and runs well, though one common issue I found is that it will crash on startup unless you disable the Steam overlay. It’s a shame though that you can’t enjoy the co-op feature…

…or can you?

MacBook Party 

Who wants to play alone all the time? This is college, and like a Neil Breen movie, it’s best enjoyed with friends by your side. Here’s a tutorial on how to set up your MacBook for some local gaming fun-time.

First things first, you’re going to want some friends. If you don’t have any friends installed into your life already, I find running “heystrangerwannaplaysomegameswithme.exe” usually helps.

Next, you’re going to want to get one of these. This is an adapter for Xbox 360 controllers, which you should also get a few of here. Plug the USB adapter into your MacBook. Now, Mac OS and the adapter will stubbornly refuse to work with each other (symbolic of the fanboying thing I mentioned at the beginning of this post), so you’re going to have to teach them the value of teamwork by installing this driver software.

Once you’re all set, you should be able to wirelessly connect the controllers to the adapter and play some video games. One optional adjustment to this process would be to connect your MacBook via HDMI to a larger display so everyone can see the screen without having to huddle around your laptop.

Enter the Gungeon has a great two-player co-op mode. I’d also recommend Nidhogg and Skullgirls for some casual competitive matches between friends.

And there you have it! Despite what some very vocal individuals on the internet might tell you, it is possible to enjoy some light gaming on a MacBook. This is the part where I’d normally make some grand statement about how the haters were wrong when they said it couldn’t be done; but alas, that would merely be fueling a war I believe to be pointless in the grand scheme of things. Are we not all gamers? Are we not all stressed with mountains of work and assignments? Are we not all procrastinating when we should be working on said assignments? While our systems may be different, our goals are very much the same. And with that, I hope you find my advice helpful on your quest for good video games.

Best,

Parker

Basic Wi-Fi Troubleshooting on macOS

From time to time, you may find yourself in a situation where Wi-Fi isn’t working on your computer while on campus. This is a quick and basic guide to help you get back online.

Disconnect & Reconnect

This is the easiest method to execute. While holding the Option key, click on the Wi-Fi icon in the menu bar. You’ll see something like this:


Click on “Disconnect from eduroam”, and the Wi-Fi icon will dim immediately. Seconds later it will reconnect, provided you are on campus where your computer is picking up eduroam. This will solve the majority of issues that are related to connectivity.
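
If you’d rather use Terminal, cycling the Wi-Fi radio with the built-in networksetup tool has the same effect; this assumes en0 is your Wi-Fi interface, which it is on most recent MacBooks:

networksetup -setairportpower en0 off
networksetup -setairportpower en0 on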

Deleting the eduroam Profile

This is a multi-step but simple process. Begin by opening System Preferences and clicking on the Profiles button.


In the Profiles menu, select the eduroam profile and hit the delete key on your keyboard.


The system will ask if you are sure you want to remove the profile. Confirm the removal.

Once the profile is removed, consult this article to set up eduroam on your laptop. This method will solve the vast majority of authentication-related issues, particularly after a password reset.

Rearranging the Order of Preferred Networks

There will be times when your computer, for one reason or another, is configured to connect to the UMASS network over the eduroam network. Whereas eduroam is secured and does not require a login each time you connect, the UMASS network is not secure and will prompt for login information, preventing normal network access.

To change this, first open System Preferences, and then click on Network.


Once in the Network menu, hit Advanced.


In the Advanced menu, under the Wi-Fi submenu, make sure that UMASS is underneath eduroam. This tells the computer to attempt to connect to eduroam before attempting to connect to UMASS.

Hit OK and close the menu. The computer may ask whether you want to apply the settings – hit Apply.
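
You can also double-check the preferred network order from Terminal. Again assuming en0 is your Wi-Fi interface, this prints the networks in the order the computer will try them:

networksetup -listpreferredwirelessnetworks en0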

Gathering information for UMass IT

If the methods above did not work, and our consultants are not able to resolve your issue over email, we may ask you for certain technical info, such as the BSSID, IP address, and MAC address. Most of the info we ask for can be easily retrieved by clicking on the Wi-Fi icon while holding the Option key.

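If you prefer the command line, the same details can be pulled up in Terminal (assuming en0 is your Wi-Fi interface): the first command below prints connection details including the BSSID, the second prints your IP address, and the third prints your MAC address.

/System/Library/PrivateFrameworks/Apple80211.framework/Versions/Current/Resources/airport -I
ipconfig getifaddr en0
ifconfig en0 | grep ether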

Hope this information was helpful!

PRAM and SMC Resets

Among the quick fixes for many issues on a Mac are PRAM and SMC resets.

PRAM stands for parameter random access memory, which can contain settings such as speaker volume, screen resolution, startup disk selection, and recent kernel panic information. Performing a PRAM reset can fix a number of issues, such as Wi-Fi connectivity problems, drives not showing up, or screens not adjusting properly. To do a PRAM reset, all you have to do is turn the Mac on and hold Command + Option + P + R until the machine chimes a second time. On a Late 2016 MacBook Pro, which has no startup chime, hold the keys down a while longer (roughly 20 seconds) before releasing. That said, PRAM resets are actually a thing of the past: the majority of Macs in use today – those manufactured after 2008 – primarily use NVRAM to store these settings. A PRAM reset and an NVRAM reset are mostly the same thing; each resets this less-volatile memory to its factory defaults, fixing a number of potential issues. NVRAM stands for non-volatile random access memory, and it is reset in the same manner as PRAM.
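
If you’re curious what actually lives in NVRAM, you can print the current variables from Terminal:

nvram -p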

SMC stands for system management controller, a component found only on Intel-based Macs that handles hardware and power management. An SMC reset can fix problems with the fans, lights, power, and system performance. There are a variety of ways to reset the SMC depending on the kind of Mac you’re working with. A desktop Mac, such as a Mac Pro, Mac mini, or iMac, requires disconnecting the power cord from the machine, waiting 15 seconds, plugging it back in, waiting another 15 seconds, and then turning the Mac back on. On a Mac laptop with a non-removable battery, shut the Mac down and connect it to its power adapter. Hold Shift, Option, and Control on the left side of the keyboard, press the power button, release all of the keys, and then turn the Mac on normally. For Mac laptops manufactured in 2008 or earlier with removable batteries, turn the machine off, disconnect the power cable, and remove the battery. Press the power button and hold it for 5 seconds. Then put the battery back in, reconnect the power cable, and turn the Mac back on.

The Advancement of Prosthetics

Whether it’s for veterans, amputees, or those born with certain abnormalities, prosthetics have allowed millions to live ordinary lives and do things they never thought possible. The idea of prosthetics is not a new one, however. As far back as the Greek and Roman eras, doctors were attaching wooden limbs to those missing legs, arms, toes, and so on. But the technology behind prosthetics has only really picked up in the 20th and 21st centuries.

But how does it work? When I think of moving my arms or my legs, I am physically able to do so. But what if I had a prosthetic? How in the world do I make this limb-shaped computer actually do stuff? Well the answer is quite impressive actually. This is definitely one of those medical practices that just makes me go “wait, we can do that?”

If you are a healthy individual, you are able to move your limbs thanks to electrical signals that your brain sends through nerves to your muscles. Your muscles receive this electrical signal and either contract or relax. But if I were to amputate, let’s say, everything below your right knee, where would that electrical signal go? The signal would still travel along the nerve toward your lower leg, but it would hit a dead end and produce no response. In order to make your newly attached prosthetic usable, some rewiring needs to be done inside the body.

Doctors are able to perform what is called targeted muscle reinnervation – in this process, doctors redirect those electrical signals to another muscle in the body, the chest for example. The nerves that once controlled your lower leg now contract your chest muscles. You’re probably thinking: how does contracting my chest make up for the fact that I’m missing part of my leg? It’s valuable because the electrical activity of those chest muscles can be sensed with electrodes and used to provide control signals to a prosthetic limb. The end result is that just by thinking of moving your amputated leg, you cause the prosthetic leg to move instead.

Even outside of the biological aspect of prosthetics, they are truly feats of engineering. Since no two human bodies are physically exactly the same, all prosthetics need to be specifically designed to each patient. A wide variety of materials are used to create the actual limb, including acrylic resin, carbon fiber, thermoplastics, silicone, aluminum, and titanium. To create a life-like appearance, a foam cover can be applied and shaped to match the real limb. A flexible skin-like covering will be applied over the foam to give it the life-like appearance.

Prosthetics have given millions the opportunity to live a normal life and the technologies behind prosthetics is only getting better. Newer technologies allow people to move their prosthetic limbs without having any invasive surgeries or neural rewiring. The future is here, let’s just make sure we don’t all turn into robots.

Forget About It!: How to forget a network in Windows 10

Sometimes, it’s better to just forget!

One of the most common tropes in the tech support world is the tried and true “have you tried turning it off and turning it back on again?”. Today, we’ll be examining how we can apply this thinking to helping solve common internet connectivity issues.

While it’s one of the best things to do before trying other troubleshooting steps, “forgetting” your wireless network is not a step most people think to take right away. Forgetting a network removes its configuration settings from your computer and stops the computer from automatically trying to connect to it. This is one way to fix configuration settings that just didn’t get it right the first time.

Today, we’ll be examining how to “forget” a network on Windows 10 in four quick, easy steps!

  1. Navigate to the Settings page and select “Network & Internet”.
  2. Select “Wi-Fi” from the left menu, then select “Manage known networks”.
  3. Find your network, click on it, then select the “Forget” button.
  4. Open up your available networks, and try to reconnect to the network you would usually connect to.


And that’s it!
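
If you’re comfortable with the command line, the same thing can be done from Command Prompt; the first command lists your saved profiles, and the second deletes one (“eduroam” here is just an example profile name):

netsh wlan show profiles
netsh wlan delete profile name="eduroam"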

While this may not solve every connectivity issue, it is a good place to start. May this quick tutorial help you troubleshoot any wireless problems you run into. If issues persist, you should next look into potential service outages, your network card, or, in the case of home networks, your modem/router.

 

What is Data Forensics?

Short History of Data Forensics

The concept of data forensics emerged in the 1970s, with the first acknowledged data crime seen in Florida in 1978, when deleting files to hide evidence became considered illegal. The field gained traction through the 20th century, with the FBI creating the Computer Analysis and Response Team, quickly followed by the creation of the British Fraud Squad. The small initial size of these organizations created a unique situation where civilians were brought in to assist with investigations. In fact, it’s fair to say that computer hobbyists in the 1980s and 1990s gave the profession traction, as they assisted government agencies in developing software tools for investigating data-related crime. The first conference on digital evidence took place in 1993 at the FBI Academy in Virginia. It was a huge success, with over 25 countries attending, and it concluded with agreement that digital evidence was legitimate and that laws regarding investigative procedure should be drafted. Until this point, no federal laws had been put in place regarding data forensics, somewhat detracting from its legitimacy. The last section of this history takes place in the 2000s, which marked the field’s explosion in size. The advances in home computing during this time allowed the internet to start playing a larger part in illegal behavior, and enabled more powerful software both to aid and counteract illegal activity. At this point, government agencies were still aided greatly by grassroots computer hobbyists, who continued to help design software for the field.

Why is it so Important?

The first personal computers, while incredible for their time, were not capable of many operations, especially when compared to today’s machines. These limitations were bittersweet, as they also limited the illegal behavior available. With hardware and software continuing to develop at a literally exponential rate, coupled with the invention of the internet, it wasn’t long before crimes grew in parallel severity. For example, prior to the internet, someone could be caught in possession of child pornography (a fairly common crime associated with data forensics) and that would be the end of it; they would be prosecuted and their data confiscated. Post-internet, someone could be in possession of the same materials but now also be guilty of distribution across the web, greatly increasing the severity of the crime, as well as the number of others who might be involved. 9/11 sparked a realization of the necessity for further development in data investigation. Though no computer hacking or software manipulation aided in the physical act of terror, it was discovered later that there were traces of data leading around the globe that pieced together a plan for the attack. Had forensic investigations been more advanced at the time, the plan might have been discovered and the entire disaster avoided. A more common use for data forensics is discovering fraud in companies, and contradictions in their server systems’ files. Such investigations tend to take a year or longer to complete, given the sheer amount of data that has to be looked through. Bernie Madoff, for example, used computer algorithms to change the apparent origin of the money being deposited into his investors’ accounts so that his own accounts did not drop at all. In that case, more than 36 billion dollars were stolen from clients, a magnitude that is not uncommon for fraud of such a degree. Additionally, if a company declares bankruptcy, it can be required to submit data for analysis to make sure no one is benefiting from the company’s collapse.

How Does Data Forensics Work?

The base procedure for collecting evidence is not complicated. Judd Robbins, a renowned computer forensics expert, describes the sequence of events as follows:

The computer is first collected, and all visible data – meaning data that does not require any algorithms or special software to recover – is copied exactly to another file system or computer. It’s important that the actual forensics process not take place on the accused’s computer, in order to ensure no contamination of the original data.

Hidden data is then searched for, including deleted files or files that have been purposefully hidden from plain view, which can sometimes require extensive effort to recover.

Beyond simply deleting files or making them invisible to the system, data can also be hidden in places on the hard drive where it would not logically be. A file could be disguised as a registry file in the operating system to avoid suspicion. Sorting through the unorthodox parts of the hard drive in this way can be incredibly time consuming.

While all of this is happening, a detailed report must be kept up to date that tracks not only the contents of the files, but whether any of them were encrypted or disguised. In the world of data forensics, merely hiding certain files can help establish probable cause.

Tools

Knowing the workflow of investigations is useful for a basic understanding, but the tools created to assist investigators are the core of discovering data, leaving the investigators to interpret the results. While the details of these tools are often kept under wraps to prevent anti-forensics tools from being developed, their basic workings are public knowledge.

Data recovery tools are algorithms which detect residual charges on the sectors of a disk to essentially guess what might have been there before (this is how consumer data recovery works, too). Reconstruction tools do not have a 100% success rate, as some data may simply be too spread out to recover. Deleted data can be compared to an unsolved puzzle with multiple solutions, or perhaps a half-burnt piece of paper. It’s also possible to recover only some of the data, and so chance comes into play again as to whether that data will be useful or not.

We’ve mentioned previously the process of copying the disk in order to protect the original. A software or hardware write blocker is in charge of copying the disk while ensuring that none of the metadata is altered in the process. The point of this software is to be untraceable, so that an investigator does not leave a signature on the disk. You could think of accidentally updating the metadata as putting your digital fingerprints on the crime scene.

Hashing tools are used to compare one disk to another. If an investigator were to compare two different servers with thousands of gigabytes of data by hand, it would take years to look for something that may not even exist. Hashing is a type of algorithm that runs through one disk piece by piece and tries to identify an identical file on another. The nature of hashing makes it excellent for fraud investigations, as it allows the analyst to check for anomalies that would indicate tampering.
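
As a simplified sketch of imaging and hashing using standard Unix tools – real investigations use dedicated write-blocking hardware and forensics suites, and /dev/disk2 here is just an example device name:

# Make a raw, bit-for-bit copy of the suspect disk
sudo dd if=/dev/disk2 of=evidence.dd bs=1m conv=noerror,sync
# Hash the original and the copy; matching digests show the copy is faithful
sudo shasum -a 256 /dev/disk2
shasum -a 256 evidence.dd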

Though many other tools exist, and many are developed as open source for operating systems such as Linux, these are the fundamental types of tools used. As computers continue to advance, more tools will inherently be invented to keep up with them.

Difficulties During Investigations

The outline of the process makes the job seem somewhat simple, if a little tedious. What excites experts in the field is the challenge of defeating the countermeasures a culprit may have put in place. These countermeasures are referred to as “anti-forensics” tools, and they can range as far in complexity as their creator’s knowledge of software and computer operations. For example, every time a file is opened, its “metadata” is changed – metadata refers to the information about the file rather than what’s inside it, such as the last time it was opened, its creation date, and its size – which can be an investigator’s friend or foe. Forensic experts are incredibly cautious not to contaminate metadata while searching through files, as doing so can compromise the integrity of the investigation; it could be crucial to know the last time a program was used or a file opened. Culprits with sufficient experience can edit metadata to throw off investigators. Additionally, files can be masked as different kinds of files to confuse investigators. For example, a text file containing a list of illegal transactions could be saved as a .jpeg file and its metadata edited so that the investigator would either pass over it, thinking a picture irrelevant, or perhaps open the picture to find nothing more than a blank page or even an actual picture of something. They would only find the real contents of the file if they thought to open it with a word processor, as it was originally intended.

Another reason data is carefully copied off the original host is to avoid any risk of triggering a programmed “tripwire,” so to speak. Trying to open a specific file could activate a program that scrambles the hard drive to prevent any other evidence from being found. While deleted data can often be recovered, a “scrambled” disk cannot: scrambling rewrites random bits across the entire drive, and overwritten data is impossible to restore, which can protect incriminating evidence. That being said, if such a process occurs, it offers compelling reason to continue the investigation, since someone has gone to such an extent to keep data out of the hands of the police.

Additionally, remote access via the internet can be used to alter data on a local computer. For this reason, it is common practice for those investigating to sever any external connections the computer may have.

Further, data forensics experts are forced to be meticulous, as small errors can result in corrupted data that can no longer be used as evidence. Beyond fighting the defendant’s attempts to hide their data, analysts also wrestle with the law to keep their evidence relevant and legal. Accidentally violating someone’s rights to data security can result in evidence being thrown out. Just as with any legal search, a warrant is needed, and not having one will void any evidence found. Beyond national legal barriers, the nature of the internet allows users to freely send files between countries with ease. If information is stored in another country, continuing the investigation requires international cooperation. While many countries inside NATO and the UN are working on legislation that would make international data investigations easier, storing data around the globe remains a common tool of hackers and other computer criminals seeking to maintain anonymity.

Looking Forward

Data security is a serious concern in our world, and it will only grow in importance given our everyday reliance on digital storage and communication. As computer technology continues to advance at its current pace, both forensics and anti-forensics tools will advance with it. With AI research being done at universities across the world, it is quite possible that future forensics tools will be adaptive and learn to find patterns by themselves. We already have security tools, such as Norton or McAfee antivirus for home computers, that remember which programs you mark as safe and make educated guesses in the future based on your preferences. This only scratches the surface of what such software is capable of, leaving much to be discovered. With the advancement of software comes the negative too: more powerful resources for cyber criminals to carry out their operations undetected. Data forensics, and information security as a whole, can then be seen as a never-ending race to stay in front of computer criminals. As a result, the industry continues to flourish, as new analysts are always needed and software advances take place every day.

CPU Overclocking: Benefits, Requirements and Risks

The Benefits of Overclocking

Overclocking is, essentially, using the settings present on the motherboard to have the CPU run at higher speeds than it’s set to run by default. This comes at the cost of increased heat production, as well as a potential reduction in lifespan, though for many people the benefits far outweigh the risks.

Overclocking allows you to basically get ‘free’ value from your hardware, potentially letting the CPU last longer before it needs an upgrade, as well as just generally increasing performance in high demand applications like gaming and video editing. A good, successful overclock can grant as much as a 20% performance increase or more, as long as you’re willing to put in the effort.

Requirements 

Overclocking is pretty simple nowadays; however, there are some required supplies and specifications to consider before you’ll be able to do it. In most cases, only computers that you put together yourself will really be able to overclock, as pre-built ones rarely have the necessary hardware, unless you’re buying from a custom PC builder.

The most important thing to consider is whether or not your CPU and motherboard even support overclocking. For Intel computers, any CPU with a “K” at the end of its name, such as the recently released i7-7700K, can be overclocked. AMD has slightly different rules, with many more of their CPUs unlocked for overclockers to tinker with. Always check the specific SKU you’re looking at on the manufacturer’s website, so you can be sure it’s unlocked!

Motherboards are a bit more complicated. For Intel chips, you’ll need to pick up a motherboard that has a “Z” in the chipset name, such as the Z170 and Z270 motherboards, which are both compatible with the previously mentioned i7-7700K. AMD, once again, is a bit different: MOST of their motherboards are overclock-enabled, but once again you’re going to want to look at the manufacturer’s website for whatever board you’re considering.

Another thing to consider is the actual overclocking-related features of the motherboard you get. Any motherboard that has the ability to overclock will be able to overclock to the same level (though this was not always the case), but some motherboards have built in tools to make the process a bit easier. For instance, some Asus and MSI motherboards in particular have what is essentially an automated overclock feature. You simply click a button in the BIOS (the software that controls your motherboard), and it will automatically load up a fairly stable overclock!

Of course, the automatic system isn’t perfect. Usually the automated overclocks are a bit conservative, which guarantees a higher level of stability, at the cost of not fully utilizing the potential of your chip. If you’re a tinkerer like me who wants to get every drop of performance out of your system, a manual overclock is much more effective.

The next thing to consider is your cooling system. One of the major byproducts of overclocking is increased heat production, as you usually have to turn up the stock voltage of the CPU in order to get it to run stably at higher speeds. The stock coolers that come in the box with some CPUs are almost definitely not going to be enough, so much so that Intel doesn’t even include them in the box for their overclockable chips anymore!

You’re definitely going to want to buy a third party cooler, which will run you between 30-100 dollars for an entry level model, depending on what you’re looking for. Generally speaking, I would stick with liquid cooling when it comes to overclocks, with good entry level coolers like the Corsair h80i and h100i being my recommendations. Liquid cooling may sound complicated, though it’s fairly simple as long as you’re buying the all-in-one units like the Corsair models I mentioned above. Custom liquid cooling is a whole different story, however, and is WAY out of the scope of the article.

If you don’t want to fork over the money for a liquid cooling setup, air cooling is still effective on modern CPUs. The Cooler Master Hyper 212 EVO is a common choice for a budget air cooler, running just below 40 bucks. However, air cooling won’t get you the same low temperatures as liquid cooling, which means you won’t be able to push as high of an overclock unless you’re willing to compromise the longevity of your system.

The rest of the requirements are pretty mundane. You’re going to want a power supply that can handle the higher power requirement of your CPU, though to be honest this isn’t really an issue anymore. As long as you buy a highly rated power supply from a reputable company of around 550 watts or higher, you should be good for most builds. There are plenty of online “tier-lists” for power supplies; stick to tier one or two for optimal reliability.

The only other thing you’ll need to pick up is some decent-quality thermal compound. Thermal compound, also called thermal paste, is basically just a grey paste that you put between the CPU cooler and the CPU itself, allowing for more efficient heat transfers. Most CPU coolers come with thermal paste pre-applied, but the quality can be dubious depending on what brand the cooler is. If you want to buy your own, I recommend IC Diamond or Arctic Silver as good brands for thermal compound.

Risks

Overclocking is great, but it does come with a few risks. They aren’t nearly as high as they used to be, given the relative ease of modern overclocking, but they’re risks to be considered nonetheless.

When overclocking, what we’re doing is increasing the multiplier on the CPU, allowing it to run faster. The higher we clock the CPU, the more voltage it will require, and the more heat it will produce.
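
To make that concrete: a CPU’s speed is its base clock times its multiplier, so with the standard 100 MHz base clock, a multiplier of 42 gives 100 MHz × 42 = 4.2 GHz, and raising the multiplier to 50 gives 5.0 GHz – the kind of jump described later in this article.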

Heat is the main concern with CPUs, and too much heat can lead to a shorter lifespan for the chip. Generally speaking, once your CPU is consistently running above 86 degrees Celsius, you’re starting to get into the danger zone. Temperatures like that certainly won’t kill your CPU immediately, but they could lower its functional lifespan overall.

For most people, this won’t really be an issue. Not many people nowadays plan on having their computer last for 10 years and up, but it could be something to worry about if you do want to hold onto the computer for a while. As long as you keep your temperatures down, though, this isn’t really something you need to worry about. Heat will only outright kill a CPU when it exceeds around 105 degrees Celsius, and your CPU should automatically shut off at that point.

The other main risk is voltage. As previously mentioned, in order to achieve higher overclocks you also need to increase the voltage provided to the CPU. The resulting heat is one problem, but the voltage itself can be another: too high a voltage can actually fry the chip, killing it.

For absolute safety, many people recommend not going above 1.25v, and just settling for what you can get at that voltage. However, most motherboards will allow you to set anything up to 1.4v before notifying you of the danger.

My personal PC runs at 1.3v, and some people go as high as 1.4v without frying the chip. There really isn’t a hard and fast rule; just make sure to check what kind of voltages people are using for the hardware you bought, and try to stick around that area.

Essentially, as long as you keep the CPU cool (hence my recommendation for liquid cooling), and keep the voltages within safe levels (I’d say 1.4v is the absolute max, but I don’t recommend even getting close to it), you should be fine. Be wary, however, as overclocking will void some warranties depending on who you’re buying the CPU from, especially if the CPU ends up dying due to voltage.

Afterthoughts – The Silicon Lottery

Now that you understand the benefits of overclocking, as well as the risks and requirements, there’s one more small concept to cover: the silicon lottery.

The silicon lottery is the term commonly used to describe variance in CPU overclocks from chip to chip. Basically, just because you bought the same model of CPU as someone else doesn’t mean it will run at the same temperatures or overclock to the same point.

I have an i7-7700K that I’m cooling with a Corsair H100i v2. I am able to hold a stable 5.0 GHz overclock at 1.3v, with the stock settings being 4.2 GHz at around 1.2v. However, not everyone is going to achieve results like this. Some chips might hit 5.0 GHz at slightly below 1.3v; some might only reach 4.8 GHz at 1.3v. It really is just luck, and it’s the main reason overclocking takes time to do. You can’t always set your CPU to the same settings as someone else and expect it to work. It’s going to require some tinkering.

Hopefully, this article has helped you understand overclocking a bit better. There are some risks, as well as some specific hardware requirements, but from my perspective the benefits are well worth it.

Always remember to do your research, and check out a multitude of overclocking guides. Everyone has different opinions on what voltages and temperatures are safe, so you’ll need to check out as many resources as possible.
If you do decide that you want to try overclocking, then I wish you luck, and may the silicon lottery be ever in your favor!

The Touch Bar may seem like a gimmick, but has serious potential

The first iPhone came out in 2007. At that time people had BlackBerrys and Palm PDAs – phones that came with physical keyboards and styluses. The iPhone was immediately praised for its aesthetics but criticized for its limited functionality. As development expanded the iPhone’s capabilities, the phone itself took off. After wrestling the market away from traditionally styled PDAs, iPhones and Androids began leaving their competition in the dust.

Jump forward to today. The new MacBook Pros come with a touch strip (marketed as the Touch Bar) in place of the physical function keys in the first row. While those functions haven’t gone away, Apple decided that a touch strip would enable a more dynamic style of computing. Of course, Apple detractors look at this as a sign that Apple is running out of ideas and resorting to gimmicks.

I recently got my hands on one of these MacBook Pros, and yes, there are obvious shortcomings. Though the computer is beautifully engineered and designed, it’s questionable that the Touch Bar itself isn’t high definition (or a Retina display, as Apple would have marketed it). Using it feels a little weird at first, since you don’t get the tactile response you do from any other key on the keyboard, but I’ve gotten used to it. There are also some minor design flaws that can be annoying: the volume and brightness adjustment sliders aren’t the most intuitive, I’ve managed to press the power button a couple of times when I meant to hit the delete key, and some of the functions the Touch Bar is most heavily advertised for are sometimes buggy, particularly scrubbing through a video – so much for Apple’s reputation for quality control.

But it’s easy to see why Apple might envision the Touch Bar as the next evolution in laptop computing. It’s clear they don’t believe in a laptop/tablet hybrid à la the Surface Pro – not even Microsoft seems to be buying into that as much anymore. But the dynamism the Touch Bar offers, or perhaps more importantly has the potential to offer, is far more appealing. And though the Touch Bar may seem limited in functionality and usefulness today, it’s a little like the original iPhone: a lot depends on the software development that follows.

Bored? Kill some time on the internet!

You’re sitting in the airport waiting for your flight, and that flight is hours delayed! Beyond that, you’ve watched all your Netflix shows, and fiddled with all the apps on your phone, and now you’re bored! What do you do in this situation?

Luckily, the internet offers almost endless websites with which to kill time! Some of them are educational, some of them are silly, with many others falling somewhere in between.

The first place many people go when they’re bored on the internet is, of course, Reddit. Reddit is a forum of sorts that markets itself as “the front page of the internet,” and it is fairly simple to understand. The site is subdivided into different boards called “subreddits,” usually denoted by “/r/[subreddit name]” after reddit.com. There is a vast quantity of subreddits, one for almost any area of interest.

There is a subreddit for pictures, for news, for gaming, and pretty much anything else! There is one for any TV show you might watch (for instance this one on Game of Thrones), or for TV in general. There are also some rather silly or interesting ones kicking around, such as this one dedicated to flat earth enthusiasts, or this one dedicated to jokingly treating life as though it is a game (check it out, it’s pretty funny), or this one called “shower thoughts.”

Even UMass has a subreddit (though admittedly it’s not terribly active)!

What makes this especially cool is that it means there is a community for almost anything you might be interested in. For instance, if you saw the new Harry Potter movie and have tons of theories you want to discuss, you can head to the Harry Potter subreddit and have interesting conversations with tons of strangers who are into it! If you are really into history, you can have your questions answered by historians.

Finally, one thing that often drives new users away from the website is its appearance. Reddit was launched in 2005 and has not functionally changed much since then, so it can be a bit of an eyesore when you first arrive. Have no fear, though: once you get used to the site’s quirks, it becomes much more manageable.

Reddit uses an upvote/downvote system that lets users help sort which content is seen first by others. Once you register for an account, you can use the arrows on the left side of posts to vote yourself!

At the top of the page, there is always a way to sort the board you are looking at, usually by “new,” “top,” or “best.” These options control which content you see first.

Finally, one thing that is almost essential to browsing Reddit is the Reddit Enhancement Suite (RES). This is a Chrome extension that makes some of the more annoying features of the website less annoying. We have a whole blog post that goes over using RES!

SU-MIMO vs. MU-MIMO

 

If you’ve been in the market recently for a new router, gateway, or access point, you may have noticed the terms “SU-MIMO” and “MU-MIMO” being tossed around as a hot new feature. But what exactly do these terms mean? And how important are they to making your wireless better?

To understand how these new wireless technologies work, it’s important to first understand how wireless traditionally works. Wireless, whether in a home, dorm, or office, is a shared medium: many different devices are all “talking” – sending and receiving packets of data – with a single piece of wireless antenna hardware at a time. Traditionally, access points could only send or receive packets to or from one device at a time. The hardware can switch between devices so fast, however, that much of the time you won’t notice a slowdown when loading your favorite website or watching a Netflix stream, even when the connection is shared with others in a household.

Problems appear, however, when you start to add more users to a single wireless antenna, or mix in latency-sensitive traffic such as a Skype call or an online multiplayer game, where packets need to be sent and received with urgency. This sort of “lag” occurs when packets can’t be sent or received fast enough: the hardware can only talk to one device at a time, and has to switch from device to device to serve each one’s packets while the others sit on a shared waitlist.

In come SU-MIMO and MU-MIMO, ways to help alleviate lag on an overcrowded network. The MIMO part stands for Multiple-Input, Multiple-Output, with SU and MU standing for Single-User and Multi-User respectively. These standards use multiple antennas to work around the traditional wireless bottleneck of serving only one transmission at a time. With SU-MIMO, the access point can send several streams of data to a single device at once, multiplying that device’s throughput; with MU-MIMO, it can transmit to multiple devices simultaneously, effectively cutting down the waitlist of devices waiting for their data.

MU-MIMO is the future of wireless technology: it costs very little to implement as a standard, and it has a noticeable impact on a clogged network without requiring anything else to change. Devices wait far less time to send and receive their packets, making for a more efficient wireless network overall. You’ll want to make sure your next home or enterprise installation includes support for MU-MIMO equipment, but note that it won’t solve all your problems. Wireless is, and always will be, a shared medium, and it remains subject to other factors such as wavelength congestion, interference, and plain overcrowding if your equipment wasn’t rated for the workload.

USB-C: The New Standard

We all know the frustration of trying to insert a USB plug into our computers or phones, flipping it over and trying again, only to find we were actually right the first time. The problem that both sides of a USB plug look the same has plagued us all. Apple fixed this problem with its proprietary Lightning connector, which is reversible (i.e., it can be inserted either way up), but that connector is found only on Apple devices. The cell phone industry in particular was in need of a new solution.

Think back to the first cell phone you had. For many, this was in the early 2000s (or even earlier); think about what kind of charger you used. Most phones came with a charger that plugged into the wall, and they were largely incompatible with other phone brands. As cell phones became more popular, manufacturers decided to settle on one standard plug type that could be used by any phone. Enter the Micro USB Type B:

[Image: a Micro USB Type B plug]

This plug should look familiar if you have an Android phone, but it has also become almost ubiquitous as the connector for most portable devices, including cameras, tablets, and smart watches. A notable exception to this standard is Apple, which has used proprietary connectors since the introduction of the iPhone.

However, since 2007, when micro-USB was introduced, the needs of the industry have changed. Data rates have increased, devices require more power, and the need to connect more types of devices calls for a new universal plug for all. This is the motivation behind the development of the USB-C connector.


USB-C is reversible like Apple’s Lightning connector, so there’s no more trouble with which way the plug is facing. It supports the USB 3.1 data standard, which can transfer up to 10 gigabits per second (Gbps). Perhaps most interesting, USB-C is capable of carrying 100 watts of power, meaning that in addition to becoming the primary data connector, it could also replace most power cords in the near future.
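
To put that 10 Gbps figure in perspective (ignoring protocol overhead): a 25 GB game download is 25 × 8 = 200 gigabits, so a full-speed USB 3.1 link could move it in about 200 ÷ 10 = 20 seconds.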

Also of note is Apple's adoption of the USB-C connector. The newest version of the Thunderbolt standard, Thunderbolt 3, uses the USB-C connector, meaning that on Apple's newest machines one can use a USB-C cable but get Thunderbolt 3 speeds of 40 Gbps. It is worth emphasizing that although the connector on Thunderbolt 3 is the same as USB-C, the data transfer specification, and therefore the transfer speed, is not the same. However, Apple's adoption of the new plug type in conjunction with the rest of the industry means that USB-C is a truly universal connector.

With its higher data speeds, power support, and ease of use, USB-C is poised to become the one port to replace all ports. Indeed, the USB Implementers Forum (the organization responsible for USB specifications) has declared that it intends USB-C to be "future-proof". As technology demands increase over the next few years, we will see if this new connector can live up to its promises.

Digital Audio Compression: a Cautionary Tale

Enter: Audio

From the beginning, people all around the world have shared a deep love of music.  From early communication through drumming, to exceedingly complex arrangements for big bands, music has been an essential part of human communication.  So it makes sense that people would push to find a way to record music as performed by the musician.

The advent of Tin-Pan Alley in New York initially produced only sheet music – the recording of music on paper, readable only by trained musicians – but it was the invention of Thomas Edison’s phonograph which altered the history of music the most. With the phonograph, sound could be recorded from the air.  Early phonograph systems employed a light diaphragm, often in the shape of a horn, suspended on springs and attached to a needle which would carve into a rotating cylinder of wax.   These records were made without any electric power and have a distinct sound which often wavers in tone character and pitch.

Thomas Edison with early Phonograph

Thomas Edison with early Phonograph

In the 1920s, the RCA company developed the first electrical audio recording system which employed microphones and a magnetic medium on which to record electrical impulses.  The microphone could vibrate more easily than the phonograph horn and could therefore reproduce sound in a more natural and consistent manner.

After a brief push to standardize the format for recorded music, Edison's wax cylinder gave way to the disk-shaped Gramophone record, which eventually gave way to the two formats we are now familiar with: the 33 rpm vinyl record, for full-length albums, and the 45 rpm vinyl record, for single songs.  Though long since displaced as the mainstream format, these records allowed for exceedingly accurate reproduction of recorded sound for half a century.

 

Audio goes Digital

In the mid-twentieth century, Claude Shannon determined that information could be encoded into discrete values, transmitted, and decoded to reconstruct the original information.  The human voice could be transmitted, not as an analog electrical representation of a sound-wave, but as a series of numbers representing the magnitude and polarity of the sound-wave at given times (samples).  This method of encoding information into discrete values allowed for the more efficient use of limited resources for transmitting and storing information.

Claude Shannon: Father of Information Theory

Claude Shannon: Father of Information Theory

For music, this meant that sound could be broken down into just the magnitude and polarity of the sound-wave at a given time and represented as a binary number.  With music now a series of 1s and 0s, and with far less information encoded than with the analog formats, music could be stored on small plastic discs called Compact Discs.  These Compact Discs, or CDs, contained an unadulterated digital representation of the original analog sound-wave.  The CD, though sometimes thin and clinical-sounding, offered some of the most detailed reproduction of sound in history.
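
For a sense of scale, here is the math for standard CD audio (44,100 samples per second, 16 bits per sample, two stereo channels):

44,100 samples/s x 16 bits x 2 channels = 1,411,200 bits/s, or about 1.4 Mbps

(1,411,200 bits/s x 60 s) / 8 = 10,584,000 bytes, or roughly 10.6 MB per minute of audio

Keep that figure in mind; it is the reason uncompressed digital audio was impractical to send over early internet connections.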

Analog-Digital frequency examples

 

Digital Streaming and the Push for Compression

In the 1990s, online music sharing services birthed a new format for recorded music: the mp3.  The mp3, though a digital representation of an analog sound-wave like a CD, made downloading music over the internet feasible.  It did so using a simple principle: use a Fourier transform to view the sound-wave as a series of sine waves of different frequencies, then remove the frequencies with low magnitudes.
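
To see why the mp3 mattered, compare the CD figure from above with a common 128 kbps mp3 (the bitrate here is an illustrative choice; mp3s come in several bitrates):

CD audio: about 1,411 kbps, or roughly 10.6 MB per minute

128 kbps mp3: (128,000 bits/s x 60 s) / 8 = 960,000 bytes, or about 0.96 MB per minute

That is roughly an 11-to-1 reduction in size, which is what made downloading a song over a 1990s internet connection bearable.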

Napster, an early music sharing service

Napster, an early music sharing service

While this technique allowed digital audio files to be much smaller, it had the nasty side effect of sometimes removing important sonic character.  The "insignificant" sounds in a piece of music might be a small amount of background hiss, but they might also be part of the timbre of a musical instrument.  For this reason, mp3 files often lack the fullness and detail of the CD or analog formats.

 

The world of Recorded Music, Post mp3

Though internet speeds have vastly improved since the early days of music streaming, compression has not gone away.  The unfortunate nature of the problem is that one powerful computer can run a compression algorithm on a digital audio file in less than a second and then stream it to a user; streaming lossless digital audio, like the digital audio from a CD, requires rebuilding physical infrastructure to handle the extra data.  From a profit-generating standpoint, compression is cheap and effective; most people don't notice that audio is compressed, especially if they are listening on computer speakers or poorly constructed headphones.

However, there are some out there who do notice.  Though services like iTunes and Spotify do not offer music for purchase in uncompressed formats, music in these formats can sometimes be purchased from lesser-known services.  CDs are also still in production and can be purchased for most new albums through services like Amazon.  Some may also be surprised to learn of the continued prevalence of the analog formats; for most new releases, a new copy can be purchased on a vinyl record.  Since most music is now recorded directly to a digital format, vinyl records are made from digital masters.  However, these masters are often of the highest quality, as there is no need to conserve space on an analog format; digital audio sampled at 44.1 kHz and at 96 kHz both consume the same amount of space on the surface of a vinyl record.

Vinyl records and CDs

So what is the answer for those looking to move beyond the realm of compressed music?  Well, we could all write in to Spotify and iTunes and let them know that we will only purchase digital audio sampled at 96 kHz with a 24-bit word-length…but there may be a simpler way.  CDs and vinyl records are still made, and they sound great!  If you have an older computer, you may be able to listen to a CD without having to purchase any extra equipment.  For faithful reproduction of the CD's contents, I would recommend a media player like VLC.  Additionally, if you have grandparents with an attic, you may even have the necessary equipment to play back a vinyl record.  If not, the market for the analog formats seems to be getting miraculously larger as time goes on, so there is more and more variety every day for phono equipment.  There's also always live music; no sound reproduction medium, no matter how accurate, can truly capture the spirit of an energetic performance.

So however you decide to listen to your music, be educated and do not settle for convenience over quality when you do not have to!

Netflix On the Go for the First Time Ever

Netflix: everyone’s favorite source for TV shows and movies–old classics, new favorites, even new versions of old classics. It’s incredibly convenient to have an entire library of videos to watch–at least until you find yourself watching Friends at 2 am the night before an exam. The only inconvenient thing about Netflix? It’s only available as a streaming service.

Until now.

On November 30th, 2016, Netflix announced that it would, for the first time, make select movies and TV shows available for download. This feature has been a long time coming, and users have been requesting it for years. Some competitors have offered it for a while: Amazon Prime, for example, has allowed Prime members to download Prime videos to Amazon devices for at least two years.

Netflix is, as far as I can tell, being pretty generous about the download feature. It’s going to be available for members of both the lower-priced subscription and the higher-priced subscription–they aren’t using it to bait users into paying for the more expensive subscription. Also, Netflix is including movies and shows that aren’t Netflix originals–crowd-pleasers such as Parks and Rec or Sixteen Candles.

Of course, everything has a limitation. Netflix’s is the actual quality of the picture. The shows and movies currently available for download have two options: a lower-quality, faster-download option, and a higher-quality, slower-download option. The catch is that the higher quality isn’t as good as the highest quality stream. The other limitation is, of course, the actual movies and shows that are available for download. The selection is currently still very limited; items such as Friends and Bob’s Burgers aren’t available. Netflix did state that more shows are forthcoming.

The conclusion? Netflix has made a step in the right direction, but they’ve still left room for improvement.

AMD’s Ryzen from Intel’s Shadow

The AMD Ryzen is AMD's newest processor, scheduled for a Q1 2017 release.

The goal with their newest chip was to match the performance of Intel's current flagship i7, the Core i7-6900K, and to deliver 40% more instructions per clock than their own previous CPU. Here's the lineup to compare.

As you can see, the turbo boost and maximum frequencies are still unknown, as is the price. The TDP, however, is 95 watts. Even the base frequency shown here isn't set in stone, so it may change before the official release.

AMD debuted their new 8-core chip using Blender's open-source application, which showed it matching and actually slightly surpassing the performance of its Intel counterpart. However, this was only one benchmark, and an open-source one that would allow recompilation for greater performance on the AMD chip. To quell doubts, AMD later showed more advanced tests, such as a HandBrake video transcode and the ZBrushCore benchmark, that supported AMD's original claim. Surprisingly, the boost mode on the Ryzen processor was turned off during these benchmarks, yet it still performed faster than the Intel i7.

Let's talk about overclocking the processor. The Ryzen supports overclocking in increments of 25 MHz, smaller than the traditional 100 MHz increment on most processors, which means the chip can be tuned to much more specific clock speeds. In addition to finer increments, the new Ryzen processor comes with Extended Frequency Range (XFR), which lets the processor sense how effectively it is being cooled and overclock itself further, allowing it to surpass its regular boost-clock maximum. Another interesting feature is Neural Net Prediction, which helps the processor predict the future pathways an application will use based on past application behavior.

All in all, how AMD's Ryzen processor does on the market may very well hold the future of AMD. The engineering team at AMD has finally come up with a truly competitive processor to help win back the market share they've been losing to Intel over the last decade. AMD has surpassed even their own expectations in terms of manufacturing, but it's up to consumers after release to determine how well the chip holds up against Intel. It's also important to note that the Intel chip being tested here is a 5th-generation Broadwell part, a "tick" in Intel's fabrication cadence; it is a generation behind the current Kaby Lake chips and will fall further behind as Intel's next generation arrives. Although the price is still unknown, I will definitely be considering switching to AMD because of this new chip.

 

Amazon Prime: An Analysis

Amazon Prime: How it can Work for You

Amazon Prime is essentially Premium Amazon, and it comes with a variety of services. The full price of Prime is $10.99/month, but students get a 6-month free trial (which is pretty generous as free trials go) and then a 50% discount on the subscription price.

 

Amazon Prime Shipping, aka The Reason Everyone Gets Prime

If you’ve ever used Amazon, you’ve probably seen the little Prime symbol, probably accompanied by the words “free two-day shipping for Prime members.” Needless to say, this service is a fantastic bargain, especially when, say, you lose your headphones and have to order a new pair.

This is probably the Prime feature that lures the most customers in, mostly because it’s the easiest to find; the other features are a lot more hidden. Unless you pay for music or movies or TV shows AND buy them from Amazon, you probably won’t run into the other features as much. Regardless, free two-day shipping is an integral part of the Amazon experience, and, if you’re lucky, you may run into a few items with free one-day shipping, or even free same-day pickup.

 

Amazon Prime Video, aka The Thing People Find After Getting Prime

Amazon Video is probably the second most popular Prime feature. This is Amazon’s way of competing with Netflix; they have Amazon original TV shows, and a moderate selection of movies and regular TV shows.

One of the interesting features that Amazon implemented is that every so often they will make pilot episodes available to watch and then Prime members can vote for which show they want Amazon to make. This is called the Amazon Pilot Season, and I personally find it to be one of the most interesting features Amazon has implemented, as it allows the viewers to be more involved with the process, and it keeps the shows’ futures in the hands of the customers, to some extent.

Another unique feature of Amazon Video is how Amazon pairs it with their devices. If you have a Kindle Fire, you can actually download Prime movies and TV show episodes to watch offline. Of course, the Kindle Fire as a tablet has its own flaws, but this feature in and of itself is pretty cool.

 

Amazon Prime Music, aka The Hidden Gem of Prime

As Prime Video is Amazon’s Netflix competitor, Prime Music is Amazon’s Spotify competitor. While Prime Music doesn’t have as extensive a selection as Spotify does, I have found that it actually has a few songs that Spotify doesn’t. Also, I have personally found that Amazon’s app tends to work better than Spotify’s; the shuffle seems to actually shuffle your music around more, and the albums are alphabetized by album title, as opposed to artist, as Spotify has it.

 

Amazon Prime Reading, aka Amazon’s Library Feature

This feature is great for any avid reader who is looking for something new to read without breaking the bank, or anyone who wants to get into reading. This feature provides a decent amount of books for free to Prime subscribers; of course, you have to either have a Kindle, or the Kindle app installed on your phone, computer or tablet, and the book selection isn’t always the best. However, it’s still a great way to discover new titles or jump in to reading without paying tons of money for books.

 

Amazon Prime Pantry, aka Get Food Delivered to Campus

This feature isn’t really as much of a benefit to college students, unless you find yourself drowning in instant ramen cups. Essentially, Prime Pantry allows subscribers to purchase food items in normal sizes, as opposed to the bulk sizes that non-subscribers are limited to. I know I would personally prefer the bulk sizes, but sometimes you don’t need to pick up twelve boxes of tissues and carry them back to your dorm.

 

Amazon Prime Photos, aka Just When You Thought Prime Couldn’t Have Any More Features

Finally, Amazon Prime has a separate app, much like its Music app, for photo storage. Because, technically, you are a paying customer, you get unlimited storage, and you can have the app automatically back up every picture on your device. You can also arrange your pictures into albums, or share them with people, or put them in what Amazon calls the “Family Vault.”

The Family Vault is a place where you and your family members can, if you so choose, share your photos with one another. You can add all of your photos or pick and choose which ones you want to share. This is a great feature for parents who love to see their kid’s photos and want the opportunity to use said pictures in photo albums or other places.

Prime Photos also has a face tagging system where it automatically separates photos out by a person’s face, and you can rename this semi-album–this allows you to view all pictures of one person at any time. There is also a feature to hide photos and videos, so they can only be accessed from the “Hidden Photos and Videos” folder.

 

So is it worth it?

In my opinion, Amazon Prime is 100% worth it. For one thing, the 6 free months are a great deal and a great way to try out the features for yourself. For another, $5.50 a month really isn’t very much to pay, if you consider all the features you get with that money. It’s less than a dollar per service! What a great deal!

AT&T Joins the Online Video Streaming Game

With the increasing popularity of online video streaming and the steep decline of traditional TV packages, it's no wonder AT&T wants in on that market share. Like most companies trying to break into a new market, AT&T is starting with very low prices. If you sign up now or shortly after the service starts, you'll get 100 channels for $35 a month. Any time after that, $35 a month will only grant you access to 60 channels in total.

At 5x the monthly cost of Netflix, you'd hope to get some more value out of this new service, and from the channel lineup it looks like you do. Signing up for 3 months of service at the start will get you a free Apple TV and Siri remote, while signing up for one month will instead get you a free Amazon Fire Stick. Premium channels like MTV, Oxygen, NHL Network, FXM, GOLF, and NBA TV are all included in the 100-channel package. If you want to keep up on Game of Thrones on HBO, you can do that too, but you'll be adding an extra $5 to your monthly cost. One of the big draws of the service is that it offers "100-plus premium channels … [not] the junk that nobody wants," says AT&T CEO Randall Stephenson. This claim is bolstered by the additional channels that would come with AT&T's pending acquisition of media giant Time Warner.

While this all seems well and good, there are a few downsides to AT&T's new service. Like any streaming service, there is a limit to the number of people who can watch at once, and the limit here is pitifully low at only two concurrent users. This would prove difficult for a four-person family if each member wanted to watch their own shows. There are also some vague issues with availability of local channels and regional live sports channels. Still, no set-top box is needed, and all you need to get started is a broadband internet connection, so this is an attractive offering for college students.

Side Effects of Virtual Reality

With the recent explosion of virtual reality in the tech world, many scientists are worried about the possible effects of long-term VR use. While there are many potential benefits of VR, such as helping people suffering from PTSD and/or depression, there are also drawbacks which, due to how new the technology is, haven't been adequately studied and explored. These effects range from something as simple as motion sickness ("VR sickness") to people having dissociative experiences and even losing their sense of presence in reality. Due to the boom in popularity, many studies are now being performed to try to determine the extent of these effects; unfortunately, it remains quite difficult to get an accurate idea of the long-term effects because of the age of the technology. While some of these fears are no doubt dramatized, much like the old belief that microwaves give you cancer, there is still a real possibility of long-term effects that may only be recognized when it is too late.

As a gamer and a techie, I will be the first to admit that VR opens doors to technologies that were once the stuff of movies, and it is incredibly exciting and almost surreal. But I also feel it should be approached with some degree of caution. An experiment on rats in virtual reality showed that certain parts of the brain, specifically in the hippocampus, would either shut down or behave erratically. The effect of VR on the eyes is also something to consider: a professor at UC Berkeley has been studying the "vergence-accommodation conflict," which causes discomfort when using VR. It arises because the eyes must remain focused at a fixed distance (where the screen is) while the distance at which they converge or diverge keeps changing (this is how the 3D effect of VR is generally achieved), which causes strain within the eyes.
Overall, I am just as excited as everybody else about the endless possibilities of VR. I just want people to approach it with an air of caution, as it is still a new and relatively under-researched technology that could have some unforeseen effects.

Good at Pictionary? Try Quick, Draw!, from Google!

It usually takes two (or more) players to play a game of Pictionary, or “guess what I drew”. Now, thanks to Google, it only takes one!

Introducing Quick, Draw!, an A.I. experiment created by Google, designed to teach itself how to recognize drawings and match them to words. Today, I'll be providing a very basic walkthrough of how the program works. If you saw the title and are just interested in trying it yourself, skip to the bottom of this article for the link.

Quick, Draw! is a neural network: an AI program designed to use machine learning to learn and remember the information it receives, so that it can better recognize similar information in the future. In Quick, Draw!'s case, it asks users to draw a picture of something to the best of their ability. Once the AI is able to sufficiently recognize the picture, it moves on to the next one; if it cannot guess the picture within 20 seconds, it simply moves on anyway. This is the first screen that will show up after starting:

BRIDGE

This is the screen that appears right before each drawing round. Here, it wants us to draw a bridge. You will then be asked to create, to the best of your ability, your interpretation of a bridge. If the computer recognizes it, you'll automatically move on to the next one!

Once you have completed (successfully or unsuccessfully) all six of your pictures, you will come to a results screen where you can analyze and see what other people drew for each picture:

results

From here, you can click on each picture to see what other people drew and what the computer recognized.

BRIDGE 2

Here we see that the AI was able to recognize the bridge, as well as showing what other words or items it may have thought you were drawing. Rainbow and fish were a couple of the other potential matches!

BRIDGE 3

Finally, if you scroll down, it will show you what other people drew that was successfully interpreted. As you draw, the Google AI uses a database of these pictures to help identify what you are trying to draw. Naturally, the more pictures it has in its data bank, the "smarter" it becomes, and the faster it can detect what you are drawing. With some pictures, you can draw as few as two lines before it can tell what you are going for!

Neural network AI is a newer technology that is just finding its legs. Earlier this year, Tay.AI, a Twitter neural-network bot created by Microsoft, was "taught" to be incredibly rude in less than 24 hours, thanks to user input. As time goes on and the technology improves, we will begin to see a whole variety of uses for this type of technology.

It'll only be a matter of time before "Skynet" becomes a very real possibility.

To get started with Quick, Draw!, head over to https://quickdraw.withgoogle.com/ to give it a try.

Navigating Mac OS X Through the Command Line (Part 2)

 

Part I is available here.
Onward!

So last time we learned a few basic commands: ls, cd, and open.  That will get us through about 75% of what we would normally use the Finder for, but now we are going to address the other 50% (no, those percentages are not a typo).  In this article, I will address the following tasks:

-Copying and Moving

-Performing actions as the Super-User

-And a few little other things you may find interesting

Without wasting too much text on witty banter, I am going to get right into it. However, I need to address one quick thing about the names of files and directories.  In part 1, we traveled through directories one at a time, but to make things quicker and easier, we will now traverse multiple directories at once.  How do we do this?

Remember in part 1 when I said the / symbol would become an important part of the directory name?  Well, this is how it works: cd /directory1/directory2/directoryn.  If directory2 is within directory1 and directoryn is within directory2, then this command takes you from your current directory straight to directoryn, passing through directory1 and directory2 in a single step.  Try it out with some of the directories on your machine.  Let's say you wish to change directory from your root directory to your desktop; simply type your cd command followed by /users/YOUR_USER/Desktop, substituting the name of your user for YOUR_USER, as shown below.  You should have just changed directory from the root directory to your desktop!
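
So, for a hypothetical user named myuser, the full command would look like this:

cd /users/myuser/Desktop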

Alright!  Now that we can represent directories in a more intricate way, we can explore the more complex tasks that the command line is capable of!

 

Copying and Moving

If I could take a guess at the number one action people perform in the Finder, I would guess copying and moving files and directories.  Unless you know the position of every particle in the universe and can predict every event in the future, you’re probably going to need to move things on your computer.  You accidentally save a file from MATLAB to your Downloads directory and want to move it before you forget it’s there.  You just 100% legally downloaded the latest high-flying action flick and you want to move it from your Downloads directory to your Movies directory.  Additionally, you may want to create a new copy of your Econ paper (which you may or may not have left until the last minute) and save it to a thumb drive so you can work on it from another machine (#LearningCommonsBestCommons).

These tasks all involve moving files (or entire directories) from one directory to another.  The last task involves both duplicating a file and moving the newly created duplicate to a new directory.  How do we do this in the Finder?  We drag the file from one directory to another.  How do we do this in the Terminal?

  1. To move files we use mv
  2. To copy files we use cp 

Here is the basic implementation of these two commands: mv file location and cp file location.  In practice, however, things look just a bit different.  I will give you an example to show the syntax in action, and I will try to clearly explain every piece of text in the command.  Let's say we have a file in our root directory called GoUMass.txt and we want to move it to our Documents folder so we can open it later in TextEdit or Vim and write about how awesome UMass is.  To move it in Terminal we would type:

mv /GoUMass.txt /users/myuser/Documents/

After typing this in, if we ls /users/myuser/Documents, we would see GoUMass.txt in the contents of the Documents directory.  Need another example?  Let's say we get cold feet and want to move it back to the root.  Here's what we would type:

mv /users/myuser/Documents/GoUMass.txt /

So now that we know how to move, how do we copy?  Well, luckily, the syntax is exactly the same for cp as it is for mv.  Let’s say instead of moving GoUMass.txt from the root directory to documents, we want to copy it. Here is what we would type:

cp /GoUMass.txt /users/myuser/Documents/

Nice and simple for these ones; the syntax is the same.

One issue though: if you try to copy an entire directory this way, you will get a nasty error message.  To get around this, we employ the recursive option of cp by writing -r directly after the command: cp -r source destination.  The -r tells the computer to repeat the copy process for everything inside the directory, as many times as it needs to get your directory from point A to point B.  (mv, for its part, can move a directory without any extra options.)  Anyone who has taken a data structures course may recognize the word "recursion" and see why it applies here.
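
For example, to copy a directory named MyProject (a made-up name; substitute your own) from the root directory into Documents:

cp -r /MyProject /users/myuser/Documents/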

You may also get a different nasty message here about permissions, which we will deal with in the next section.

…oh, and one last thing: if you want to delete a file, the command is rm.
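
The syntax mirrors the commands above, and -r works here too for deleting an entire directory.  For example, to delete our GoUMass.txt from Documents:

rm /users/myuser/Documents/GoUMass.txt

Be careful with this one: unlike the Finder, rm does not move anything to the Trash.  Deleted means deleted.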

 

Performing Actions as the Super-User

Permissions are a nasty thing in the computer world and can really hold you back.  The Finder often deals with this by prompting you to enter your administrator password.  Mac users who have configured a connection to Eduroam on their own (which I hope most of you have) will have had to do this an annoying number of times.

The Terminal will not pop up and ask you for your password; you have to tell it when you are about to do something which requires special permissions.  How do you do this?  You use a command called sudo.  sudo stands for super-user do and will allow you to do nearly anything that can be done (provided you are acting as the right super-user).  This means that the clever guards the folks at Apple put in the Finder to prevent you from deleting things, like your entire hard drive, are not there.  For this reason, you can mess up some really important things if you use sudo, so I urge caution.

So how does sudo work syntactically?  There are two things you can do with it: you can preface a command with sudo, or you can use sudo su to enter the super-user mode.

Prefacing a command is simple: you type sudo before whatever it is you want to do.  For instance, let's say our GoUMass.txt needed administrator privileges to move (unlikely but possible).  We would type in our move command the same as before, but with one extra bit:

sudo mv /GoUMass.txt /users/myuser/Documents/

After you type this in, your computer will prompt you for your password; enter it and press return.  Do not be alarmed that nothing shows up in the command line when you press a key; that's normal.  If you enter the correct password, your computer will do the thing you asked it to after the sudo command; in this case, mv.

You can also invoke actions as the super-user using sudo su.  The su command will lock in the sudo privileges.  The syntax for this is as follows:

sudo su

That’s it!  After this, you will be prompted for your administrator password and then you are good to go.  The collection of cryptic text prefacing your command will change after you enter sudo su; this is normal and means you have done things correctly.  In this mode you can do anything you would have needed the sudo command for without the sudo command; sudo mv becomes just mv.

 

And a few little other things you may find interesting

The command line can be used for a wealth of other tasks.  One task I find myself using the command line for is uploading and downloading.  For this, I use two programs called ftp and sftp.  Both let the user view the contents of a remote server and upload to and download from it.  sftp offers an encrypted channel when accessing the server (the 's' stands for secure) and has the following syntactic structure:

sftp username@server.address

If your server requires a password, you will be prompted for one.  Once you're logged in to your server, you can use commands like get and mget to download, and put and mput to upload.  It will look something like:

mget /PathToFileRemote/filename.file

mput /PathToFileLocal/filename.file

 

Wondering if your internet is working?  Try using the 'ping' command!  Pick a website or server you would like to ping and ping it!  I often use a reliable site like Google.com.  Your command should look something like:

ping www.google.com

You should start getting a bunch of cryptic-looking information about ping times and packets.  This can be useful if you are playing an intense game of League of Legends and want to know your ping time (because you are totally lagging).  The main use I find for the ping command is to see whether the wireless network I'm connected to is actually connected to the internet.  Though rare, it is possible to have full wifi reception and still not be connected to the internet; ping can test for this.
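
One tip: on a Mac, ping will keep running until you stop it with control-C.  If you just want a quick check, the -c option tells it to send a fixed number of pings and then stop:

ping -c 5 www.google.com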

 

Feel like you can do anything in the Terminal that you could do in the Finder?  Want to add the ability to quit the Finder?  Here’s what you type:

defaults write com.apple.finder QuitMenuItem -boolean true

Follow this command with the following:

killall Finder

Your machine will glitch out for a second, but when things come back online you will have a cool new ability: command-Q will actually quit the Finder.  Here's what you lose: Finder windows and your desktop.  This is the fabled way to improve your computer's performance through knowledge of the command line.  The Finder uses more of your computer's resources than the Terminal does, so substituting one for the other can help if your computer often runs hot or slow.
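
If you decide you miss your desktop, the tweak is just as easy to undo; write the same preference back to false and restart the Finder again:

defaults write com.apple.finder QuitMenuItem -boolean false

killall Finder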

Remark: if your computer is running outrageously slow, try running an antivirus scan, like Malwarebytes, or checking to make sure your drive isn’t failing.

 

In Conclusion

The command line, once you get a grip on some of its less-than-intuitive syntax, is an invaluable tool for using any computer system.  For everyday tasks, the command line can be faster, and for slightly beefier tasks, it can be the only option.

And for those still in disbelief, I implore you to try installing a package manager, like Homebrew, and installing some applications for your command line.  If you can think of it, it's probably out there.  My personal favorite is an application called 'Links', which is a text-based internet browser for the Terminal.

The command line, on any system, is one of the most important tools for navigation and operation.  Even for those who do not want to become one with the computer, the command line can really come in handy sometimes.

Next Generation Consoles: Will you make the Switch?

I've been playing video games for about as long as I can remember. If your experience was anything like mine, your first gaming experience was with a Nintendo, or maybe your mom just called your console a Nintendo. The first game I remember playing was Tetris on the Game Boy Color, and now Nintendo is working on its next release.

In late October Nintendo announced their next big release with the following video.

The Nintendo Switch is taking the next step forward in console gaming. The Switch is a unique hybrid of a powerful home console and a portable gaming system. As seen in the video, the multiplayer capabilities of the console are also available while using it as a handheld console.

Two of the games prominently featured in the video were The Legend of Zelda: Breath of the Wild and the remastered version of The Elder Scrolls V: Skyrim. Both are open-world games and boast large maps. The Legend of Zelda is a title owned by Nintendo, but for the Switch, Nintendo is also partnering with Bethesda Game Studios, the developer of Skyrim and the Fallout series. In addition to Bethesda, Nintendo has partnered with Activision, Capcom, Electronic Arts, Havok, Konami, SEGA, THQ, Ubisoft, and other game developers.

The Nintendo Switch is expected to launch in March of 2017, but pricing information is not currently available. For more information, you can visit the Switch webpage at https://www.nintendo.com/whatsnew/detail/first-look-at-nintendos-new-home-gaming-system

iOS 10 and Mac OS Sierra

Apple recently released its latest desktop and mobile operating systems, macOS Sierra and iOS 10. Both bring improvements and changes that are sure to please some and upset others.

iOS 10 looks and feels very different, with a new lock screen, notification center, and control center, plus updates to apps like Messages and Music. But as with any software update there are bugs, including one where trying to update the device over the air caused it to crash and boot-loop (although this has since been fixed). This is a reminder that day-1 software may have bugs, and it is never a bad idea to hold off a week or two for the issues to be ironed out.

macOS Sierra brings along a new naming scheme: gone is OS X and in is macOS. Some are speculating that this is a way to bring iOS and macOS closer, although for now they remain separate and distinct OSes. Continuity, however, bridges the gap between iOS and macOS with a universal clipboard between Apple devices, and now Siri is on the Mac.

iOS

iOS 10 brings many small but convenient updates and changes. Slide-to-unlock (present since the first release of iOS) has been replaced by pressing the home button to unlock. Many of the stock applications have received a couple of new features here and there, and a lot of focus went into tying separate applications together to make for a single fluid system. One example of this focus on context-driven capabilities is transferring events and contacts found in emails to Calendar and Contacts respectively. Messages also received a considerable update with the addition of new stickers and animations to emphasize the expression of your texts.

One trade-off of iOS 10 is in its APIs (Application Programming Interfaces), which help programmers design applications. One API that allowed low-level access to information about hardware, such as the battery cycle count or charge voltage, was removed. But Apple added three new APIs allowing more access to iMessage, Maps, and Siri, which enable new integrations such as booking a restaurant from Maps or sending larger, more emotive emoji.

 

macOS

macOS Sierra borrows many of its new features from iOS. From the integration of Siri to the adoption of Apple Pay online, the influences of iOS are clear.  Disk Utility regains the option to set up RAID arrays, a feature that was removed in 10.11; while not a widespread use case, it is nice to have the ability to set up special storage options again.  In stark contrast, Apple has really limited the security options for Gatekeeper. Before, if you were trying to install a program from an unidentified developer (like Cloudpath), all you had to do was change a setting in System Preferences to accept all 3rd-party apps. Now you have to run the application and then go into System Preferences to give it explicit access to run; alternatively, you can right-click it and hit "Open" to provide the necessary permission. In some cases it means a few extra clicks, but the general security improvement is an interesting trade-off to consider.

iCloud also has a new trick that affects macOS: older files (such as documents and voice memos) can be moved to iCloud to save space on the machine. This lets Apple charge for iCloud storage and ship smaller storage in its machines, a formula popularized by Google's Chromebooks. The optional feature is designed to free up space as an alternative to using external drives.

sudo also changed how it works: if you authenticate in one Terminal tab, it will not carry over to the other tabs, meaning that you have to re-authenticate the sudo command in each tab.

Summary 

Overall, the iOS 10 and macOS upgrades do not provide any world-changing features, but they do provide many smaller changes that should make devices easier to use and more feature-rich. From the changes to Apple Music to the new lock screen, this update feels like a step closer to a "perfect" experience. Looking to the future, Apple released a preview of its new file system, Apple File System (APFS), which is designed to provide better support for solid-state drives and implements software tools such as TRIM, snapshots, and cloning, as well as overcoming many of the limitations of HFS+. While not an exciting visible change, this backend work should allow for new features and, hopefully, better performance and a better user experience.

Bluetooth Headphones: Are you ready to go wireless?

The time has finally come, and Apple has removed the 3.5mm jack from its newest line of iPhones entirely. While this will lead to a new generation of Lightning-connector headphones, it will also considerably increase the popularity of Bluetooth headphones. Like the electric car and alternative forms of energy, Bluetooth headphones are something everyone is going to have to accept eventually, but that's not such a bad thing. Over the past few years Bluetooth headphones have gotten cheaper, better-sounding, and all-around more practical for the average consumer. With the advent of Bluetooth 4.2, the capacity is there for high-fidelity audio streaming. Think about it: as college students we spend a lot of our time walking around (especially on our 1,463-acre campus). Nothing is more annoying than having your headphone cable catch on clothing, create cable noise, or get disconnected altogether. There are many different form factors of Bluetooth headphones to fit any lifestyle and price point. Here are a few choices for a variety of users.

Are you an athlete? Consider the Jaybird Bluebuds X

These around-the-neck IEMs provide incredible sound quality and have supports to stay in your ears whether you're biking, running, or working out. Workout getting too intense and you're worried about your headphones? Don't sweat it! The Bluebuds are totally waterproof, with a lifetime warranty if anything does happen.


Looking for portable Bluetooth on a budget? The Photive BTH3 is for you

Well reviewed online, these $45 headphones provide a comfortable fit and a surprising sound signature. It's tough to find good wired headphones at that price, yet the BTH3s sound great, with the added bonus of wireless connectivity and hands-free calling. When you're not using them, they fold flat and fit into an included hard case so you can stow them in your bag safely.


High performance import at a middle-of-the-road price.
Full disclosure: these are my headphones of choice. At double the price of the previous option and around a quarter the price of the Beats Studio Wireless, we find these over-ear Bluetooth headphones from the makers of the famous ATH-M50. With a light build, comfortable ear cups, and amazing sound quality, these headphones take the cake for price-to-performance in the ~$100 range.


Have more money than you know what to do with? Have I got an option for you.

What you see here are the V-MODA Crossfade Wireless headphones, and they come in at a wallet-squeezing $300 MSRP. With beautiful industrial design and military-grade materials, they're an easy choice over the more popular Apple wireless headphone offerings. Like other headphones in the V-MODA line, these are bass-oriented, but the overall sound signature is great for on-the-go listening.

How to Build an Electric Longboard

My name is Kirs. Maybe you've seen me on campus: I'm the guy who rides around on an electric longboard, usually wearing sunglasses and earbuds and sometimes even a helmet. But what makes me stick out of the crowd is when I zip past you at 25 mph with a 3.5-horsepower motor buzzing under my feet. The culprit is an electric longboard that I built over the past summer, and building it was one of the best decisions of my life. Electric longboards are still a new piece of technology, so I'm writing this to clear up some misconceptions and maybe teach you how to build your very own.


Why get an Electric Longboard?

So an electric longboard, or "e board," is just another type of electric vehicle: there is no gas engine; the energy comes from a battery. Charging my board costs just pennies for a full charge, which can last me for miles. Not to mention that I am reducing my carbon footprint by not relying on gas. You might be asking, why not just use a normal longboard or skateboard? The answer is simple: an e board can take you further and faster than a traditional board ever could. Riding an e board is its own type of fun, and is honestly more enjoyable than tirelessly kicking around campus on a traditional skateboard. It feels like you are always going downhill.

So an e board can transport you for miles, at speeds up to 30 mph, and doesn't use gas; that sounds a lot like a bicycle. I'd have to say that e boards and bikes are very similar. They fill similar transportation niches and cost about the same, with prices ranging from $300 all the way to $2,000 and up. I personally prefer the e board, obviously, but that's just my opinion. They both have their pros and cons, and I highly recommend considering the options and making a decision based on your needs.

Bikes are safer, handle rough road conditions better, and never run out of battery. E boards don't require any of that pedaling nonsense, let you go as fast or as slow as necessary, and don't need to be locked up every time you go inside a building. The biggest drawback of an e board is that you have to carry it everywhere you go, but that also means you can easily switch from pedestrian to "cyclist," which lets me zip around campus very quickly. There have been many times I have stepped out of my dorm on Orchard Hill, zipped to my class in Herter Hall, and run in, all in under 3 minutes. I don't have to walk to the bike racks behind my building and unlock and lock my bike every time I go inside; my vehicle is at my side, at the ready. Many times I'll zip between classes, dining halls, and dorms, but I'll find myself in situations where I need to walk. Let's say I'm getting lunch with my girlfriend and we are headed to Hamp dining hall: I'd just pick up my e board and walk along next to her. Bikes are great at getting from one place to another, but electric longboards make every journey an adventure.

How to build your very own E-Board

So I have you convinced that you need an electric longboard (or you just skipped to this section), and you want to know how to build one. Let me get one thing straight before you begin planning: there are two things to keep in mind, effort and money. The more work you put into the board, the more you can customize it and the more you can reduce costs. You could go as far as CNC-machining your own aluminum mounts and carving your own deck out of a maple tree you cut down yourself, or you could go the zero-effort route, buy a Boosted Board V2 for $1,600, and call it a day. I did something in the middle. My build didn't require anything more than a power drill, some allen keys, and a soldering iron. It can vary a lot depending on what resources you have, so keep these things in mind when planning your build.

I had no experience with motors, electrical wiring, batteries, or anything 'DIY' before this build. I was just some freshman with a high-school level of physics under my belt and a problem that needed fixing. You don't need to be an electrical engineer to build an electric longboard; you just need some passion, time, and money. So I'm just going to spit out the specs of my board; I'll explain them after, so don't worry if you don't understand yet.

  • Board – Beercan Boards 40″ Kegger DTP
  • Motor – DIY 6355 (230kv, 2650W, 12S, 80A)
  • Battery Pack Cells – 10s1p 8,000mAh LiPo
  • ESC – VESC
  • Wheels – 83mm Wheels (Black)
  • Drive Pulley – 13T Motor Pulley
  • Wheel Pulley – 36T Drive Wheel Pulley
  • Belt – 255mm High Torque Timing Belt
  • Controller – WiiMote

So my board cost about $900, which is middle of the road in terms of price. If you just want a campus cruiser to get you to class, you could easily get a build done for $400. Be aware that going cheap on certain parts can really hurt the performance, range, and reliability of your e board. Let me give a breakdown of all the parts and the importance of each one.

Board – The actual deck is very important; this is the part you will be strapping all your electronics to. You can buy a new board or use one you already have. You don't need a weird aluminum deck like mine; wood will suffice. I recommend longboard decks, boards about 36″ or longer, over normal skateboard decks because they allow more space for mounting things underneath. You can get pretty creative with this part and find a board that's unique.

Motor – Motors come in all shapes and sizes, so you should check out a hobby shop or hobby parts website. This is where things get all "engineer-y," but it's not that bad. What you are looking for is a brushless outrunner motor with a kv rating from 170 to 245 and a power rating between 1,500 and 3,000 watts. Think of the kv rating as determining how much torque your board will have: the lower the kv, the higher the torque. Watts measure how much power your motor can put out, and more watts is generally better. My 230 kv, 2,650-watt motor is pretty beefy and is more than enough for a single-motor build.
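
For the curious, the kv rating also has a concrete meaning: it is the motor's unloaded RPM per volt. As a rough illustration using my own setup:

230 kv x 42 V = 9,660 RPM (unloaded)

That number matters later, because the pulleys are what gear those motor RPMs down to a sane wheel speed.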

Battery – The battery determines how far you can go, and you will want one that is compatible with your motor. My battery is 10s1p LiPo, which means I have 10 LiPo cells in series with 1 parallel line. That means the voltage of my battery is (10 x 4.2) = 42 volts. My motor is rated for 12s, or (12 x 4.2) = 50.4 volts, and I am running 10s, so that's all kosher; just don't let your battery voltage exceed your motor's max voltage. The capacity of the battery is measured in mAh, and that determines how much juice your battery will have. Mine is 8,000 mAh, and from this you can determine how much energy you have in watt-hours.

The formula is (capacity)*(voltage)/1000 = (energy), or (mAh)*(V)/1000 = (Wh). So for me that is…

(8000)*(42)/1000 = 336 Wh

A decent rule of thumb in the e board community is that you use roughly 10 Wh per kilometer of riding, so my 336 Wh pack works out to somewhere around 30 km of range. As you can see, my board goes very far.

Here is a great Guide to Understanding LiPo Batteries, which I highly recommend you read. LiPo is great but can also become dangerous, so it is very important that you follow all the safety protocols.

ESC – The ESC is the electronic speed controller, the brains of your build. It connects to your controller and the battery and determines how much juice to give the motor. The one I use, and recommend, is the VESC, which stands for Vedder's ESC. Some guy in Scandinavia (Benjamin Vedder) designed the VESC as an ESC for e boards, and it has become the industry standard for e boards. Just get this one.

Here is a walk-through of what the VESC is and how to use it.

Wheels, Pulleys, and Belts – Your wheels, drive pulley, wheel pulley, and belt all have to fit together into what is referred to as a drive train. The ratio of the wheel pulley to the drive pulley is called the "gear reduction ratio." You want that to be around 2.5, but it can go as low as 1.5 or as high as 3; generally a lower reduction ratio is better. Bigger wheels mean more clearance, a faster top speed, and more stability, but they also mean less torque and acceleration.
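
To see how all of these numbers interact, here is a rough no-load estimate for my setup (actual top speed is lower once you account for load and losses):

Gear reduction: 36T / 13T = about 2.77

Wheel speed: 9,660 RPM / 2.77 = about 3,490 RPM

Wheel circumference: 83 mm x pi = about 0.26 m

Top speed: 3,490 RPM x 0.26 m = about 907 m/min, or roughly 54 km/h (34 mph) in theory

Which lines up with why, in practice, my board tops out in the 25 to 30 mph range.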

Read through this Guide to Building Your Own Drive-Train to learn more.

Controller – This is the device you hold in your hand to control the ESC, which controls the motor. I used a Nyko Wii Nunchuck, but it is more common to use an RC controller because it is more reliable and does not require a soldering iron (up until this point you haven't needed one). I would not recommend that anyone get the Wii Nunchuck, for fear of it not being set up correctly and possibly getting you hurt.

Once you have all your parts ordered, all you have to do is assemble everything. You will still need little bits and bobs for your build: things like screws, wires, and glue. This is just an intro to get you started, and I really recommend learning as much as you can about e boards before diving in.

electric-skateboard.builders and endless-sphere.com are great forums about e boards, with great communities full of people who can answer any questions you have.

Here is a list of websites where you can buy parts.

Building an electric longboard is not that difficult and can be a fantastic way to learn about motors, electricity, and batteries while also creating a fantastic mode of transportation. I hope to see more of them on the roads.