General Gaming Article

Snaptracs Tagg: The Pet Tracker Review

Posted: 06 Jan 2012 07:28 PM PST

A great idea that might be just a little ahead of its time 

For those of us with pets, the animal is nearly as much a part of the family as any human. Losing that pet—whether it runs away, becomes lost, or is stolen—can be as tragic as losing any other member of the family. 

Implanting a microchip in your pet might help you recover it, but only if the animal shows up at a facility—such as the pound or the Humane Society—that's equipped with a scanner. Snaptracs, a division of the mobile-technology behemoth Qualcomm, promises a much better solution: a $100 GPS device that attaches to the pet's collar, so you can instantly locate your pet anywhere on the planet (there's also an $8-per-month subscription fee after the first month). You can add up to nine additional pets to the subscription plan for $1 per month each, plus the cost of each Tagg tracker. 

Tagg: The Pet Tracker consists of a battery-powered GPS tracker that attaches to your pet's collar and an AC-powered base station. 

Tagg: The Pet Tracker represents a number of impressive technology achievements; unfortunately, it also has a few limitations that can render it practically useless. We'll explain how the device works, and then we'll discuss our real-world experience with it. 

The Tagg hardware is, for all intents and purposes, a cell phone stripped of its mic, speaker, and keyboard. The Tagg tracker is incredibly small and lightweight (just 1.5 ounces, including the collar clip), and it's relatively rugged (as it must be to fit on a dog or cat's collar and survive the abuse that an animal and the elements will dish out). The tracker is rated water resistant in up to three feet of water for up to 30 minutes. The antennas are hidden in a pair of curved, flexible wings that follow the contour of the pet's neck. 

Once you've charged the Tagg tracker's battery on its AC-powered base station, you clip the tracker to your pet's collar. The Tagg tracker or the base station will then push status reports over Verizon's cellular network via SMS and/or email. Once you log into your Tagg account on the Web from a PC or your smartphone, you can also ping the tracker and it will report the pet's approximate location with an icon superimposed on a satellite map.

 

Tagg will send you an alert when your pet moves outside the 75-yard radius of its home zone (the area in blue). 

In order to preserve the Tagg's battery life, the device goes into an extreme low-power state whenever it's within wireless range of the base station, and the bulk of the cellular-network communications occur on the base station during this time. As soon as the Tagg tracker can no longer detect the base station's signal beacon—when the animal moves outside a 75-yard radius of the base station—the Tagg's cellular chip switches to full-power mode, establishes a direct link to the network, and sends an alert that the pet has moved outside the designated Home Tagg zone. You can increase the size of the Home Tagg zone, but not decrease it. To reduce false alarms when you take your dog on a walk, you can push a button on the Tagg tracker to put it into "trip" mode, which lets you move the pet any distance outside the Home Tagg zone without producing an alert. The Tagg tracker's LED flashes blue in this state, and it resets itself as soon as it comes back within range of the base station. 
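To make the beacon-driven behavior described above concrete, here's a minimal sketch of the tracker's state machine. This is purely illustrative: the class, the method names, and the alert text are our own inventions, not Snaptracs' actual firmware logic.

```python
# Illustrative model (NOT Snaptracs' real firmware) of the Tagg tracker's
# low-power / alert behavior: sleep near the base station, wake and alert
# when the beacon is lost, and suppress the alert in "trip" mode.

class TaggTracker:
    def __init__(self):
        self.trip_mode = False   # set by the button press before a walk
        self.low_power = True    # true while the base station beacon is heard

    def on_beacon_update(self, beacon_in_range: bool):
        """React to the presence or absence of the base station's beacon."""
        if beacon_in_range:
            # Back home: return to the low-power state and clear trip mode.
            self.low_power = True
            self.trip_mode = False
            return None
        if self.low_power:
            # Beacon just lost: wake the cellular radio and link directly.
            self.low_power = False
            if not self.trip_mode:
                return "ALERT: pet left the Home Tagg zone"
        return None

tracker = TaggTracker()
print(tracker.on_beacon_update(False))  # pet wanders off -> alert fires
```

Note that a tracker in trip mode crossing the same boundary would return no alert, and returning within beacon range resets both flags, matching the behavior the review describes.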

The system is capable of sending a number of other messages, too. It will report when the animal is within range of the base station, when the Tagg's battery is low, when it's fully recharged, and even when the Tagg becomes detached from your pet's collar. You can also activate a tracking mode from the website that will automatically locate your pet every three minutes for up to 30 minutes. Like we said, the technology is very impressive. How it works in the real world is a whole other matter, which we'll tackle now. 

Real-world Testing 

 Okay, I'll drop the royal "we" here, because the rest of this review is based on my personal experience with my dog (a full-grown, 80-pound mutt named Dixie) and my daughter's family's dog (a nine-month-old, 50-pound Catahoula named Sally). Both dogs live on the same 10-acre parcel of land in northern California. Dixie is an outside-only dog; my daughter spoils Sally by letting her sleep inside the house at night. The two dogs are fast friends who love to play together, but Dixie has a bad habit of jumping our barbed-wire fence to roam the neighborhood. 

As you can see from the screenshots on this page, the Home Tagg zone covers a lot of territory: a circle 150 yards across, enclosing roughly 3.5 acres. Dave Vigil, Snaptracs' president, tells me they could make the Home Tagg zone smaller, but they won't, because they want the Tagg to remain connected to the base station as much as possible. This preserves battery life, and it reduces the number of alerts you receive. As Vigil explains it, "My neighbors know my dog, so I'm not too concerned if he gets into their yard." I'm not so sanguine about Dixie wandering off my property, because if she gets into my neighbor's yard, she might eat his chickens. That's a life-threatening menu for Dixie, and it's not because she's allergic to chicken. 

The other problem I encounter is that Dixie spends most of her time with Sally at my daughter's home, which is on the opposite side of our property (Snaptracs' satellite maps are old, because my four-year-old home doesn't show up on them). That means she's nearly always outside her Home Tagg zone, and I'm barraged with alerts. It also kills the Tagg's battery life. While Snaptracs claims the battery should last as long as 30 days, I'm lucky if Dixie's lasts seven. I partially solved this issue by moving the base station to my daughter's house, but now I get alerts whenever Dixie is home. 

But the biggest problem I have with Tagg: The Pet Tracker is that the Tagg has repeatedly fallen off Dixie's collar. Snaptracs sent a second Tagg because the company thought the original might have a manufacturing defect, so I took the liberty of putting it on Sally's collar so I could track both dogs. I've had the same problem with this second unit. I've never seen them fall off, but because I usually find them in stiff brush, under low tree branches, or by the fence, I suspect those wing-like antennas are getting snagged and yanked off the dogs' collars. The last time it fell off Dixie's collar, I'm pretty sure it was because Dixie and Sally were roughhousing, because I found the partially chewed tag buried in the dirt (see photo below).

 

Sally mistook Dixie's Tagg tracker for a chew toy. 

I'm away from home a lot, and my dog's escape antics are a real problem, so I really hoped this product would work. But as impressed as I am by what Snaptracs has achieved, I just can't recommend buying the company's device and service. Your mileage may vary—especially if you have a more sedate pet—but in my experience, Tagg the Pet Tracker would be more aptly named Tagg: The Tagg Tracker, because I used it far more often to locate the detached GPS device than I ever did my wayward mutt. 

 

 

Netflix App Arrives in UK PlayStation Store, Netflix Access Still Missing

Posted: 06 Jan 2012 02:13 PM PST

Congratulations, UK PS3 owners. You've got Netflix! Well, a Netflix app, at least. Did we mention there is still no Netflix service in the UK? You were probably aware of that, but the appearance of the app in the PlayStation Store should offer some hope that the service is really and truly going to arrive soon.

The app can be installed, but region-locking prevents it from running when launched. Instead, users will get a message reminding them that Netflix is launching soon. The app will take your email address and notify you once a launch date is set. For now, that inviting red icon is just going to sit there, mocking you from the XMB interface. 

Netflix has been preparing a UK launch to compete with Amazon-owned LOVEFiLM for the last few months. The service has already expanded to Canada and Latin America. 

Major Media Outlets Ignore SOPA, Support Passage

Posted: 06 Jan 2012 01:59 PM PST

Quite a few technology enthusiasts have noted the almost complete lack of airtime SOPA and Protect IP have gotten on media outlets as the debate continues to wind through Congress. A new report sheds a bit of light on the topic, pointing out that most media companies are on record supporting the legislation. 

The MediaMatters report found that among big names like MSNBC, Fox News, ABC, CBS, and NBC there was not a single mention of SOPA or Protect IP during the evening news broadcasts. CNN was good enough to talk about SOPA once in the last few months. Technology experts and observers alike fear that these bills could damage the fundamental structure of the Internet and hand copyright holders too much power to censor content.

ABC and CBS are listed as official supporters of the bill, while the likes of Time Warner (CNN), News Corp (Fox), and Comcast (NBC) have simply spoken in favor of it. Most technology companies, like Google, Facebook, and Twitter, are strongly opposed to the bill for fear it would harm the openness of the Internet. 

Spotify's Free Music Deal Ending Next Week

Posted: 06 Jan 2012 01:35 PM PST

When Spotify arrived in the U.S., there was such fanfare that one part of the rollout plan was largely ignored. The free Spotify playback on the desktop enjoyed by so many users was only set to last for six months, and next week is Spotify's six-month anniversary in the U.S. market. When that sweetheart licensing arrangement is up, free Spotify accounts are going to be much more locked down.

The current unlimited playback with ads is going to be reduced to a maximum of 10 hours per month. Users will also be limited to playing any single track five times in a month. Anyone who wants to keep listening to unlimited tunes will have to upgrade to one of the paid accounts: $5 a month gets you unlimited music on the desktop, and $10 per month is required for mobile access to Spotify. 

Spotify has faced scrutiny for its small payouts to independent artists, its Facebook tie-in, and its licensing deals. It's unclear whether free listeners will be willing to pay for access to something they've gotten for free these last six months.

Is Your Gaming Laptop's RAM Slowing It Down?

Posted: 06 Jan 2012 12:47 PM PST

We look at the effect of memory bandwidth and clockspeed on gaming performance.

The mystique of adding RAM to a system to "increase performance" is often misunderstood by the average person. Most think that if their seven-year-old Windows XP build is getting slow, doubling the RAM from 2GB to 4GB will speed it up. Any PC tech worth his Pringles knows that won't do much for Windows XP performance. Generally, it's very easy to hit the point of diminishing returns with system RAM.  But there's one bad pattern we've been seeing in many of the notebooks with integrated graphics lately: configuring RAM for the minimum system bandwidth.

If you're a browser jockey, that's not a huge issue, but if you play games that rely on the integrated graphics core, that configuration can hobble your performance. To see what the situation is, we decided to take a typical modern notebook and measure the impact of system memory bandwidth on gaming. Read on.


A short history of integrated graphics

The issue at hand is how integrated graphics accesses RAM vs. a traditional discrete card. A discrete card in a notebook has its own dedicated pool of RAM. Besides offering far higher data rates by using GDDR5, a discrete GPU's RAM usually runs at much higher speeds, since it's soldered directly to the board the GPU rides on and the wires run straight to a dedicated, very wide, high-speed memory controller. While the width of the memory controller and the speed of the RAM have changed over the years, this basic discrete-GPU design has remained much the same.

Integrated graphics, on the other hand, has changed quite a bit since its introduction. Initially, "integrated graphics" meant the graphics core was a discrete chip with RAM soldered to the motherboard, connected to the CPU via PCI. The graphics core eventually moved into the core-logic chipset itself, with chipsets from SiS, VIA, and Intel's 810 "Whitney." Instead of relying on dedicated RAM on the motherboard, chipset-based graphics mostly used main system memory, which is far cheaper to implement. We say mostly, because both AMD and Nvidia have tried to ameliorate memory bandwidth and size issues by adding internal cache to the integrated graphics component. Those solutions have mostly remained outside the mainstream, though. Integrated graphics has always been about making it as cheap as possible.

It doesn't get any cheaper than integrating the graphics directly into the CPU itself. This theoretically lowers the cost of the chipset and of the overall system, and it conserves power, too. Intel's Clarkdale and Sandy Bridge CPUs take this approach, as do AMD's Llano and Brazos APUs. Despite this step forward, though, integrated graphics still suffers greatly in one area: memory bandwidth. A dual-channel DDR3/1333 setup, for example, offers a theoretical bandwidth of 21.3GB/s. Compare this to a stock-clocked GeForce GTX 560 Ti, which has 128GB/s of bandwidth on tap, and the top-end GeForce GTX 580, which takes it to 192.4GB/s. Mobile GPUs don't offer quite the same amount of bandwidth, but the GeForce GTX 580M mobile part still moves 96GB/s. It's not always the case, but generally discrete parts offer boatloads more memory bandwidth.
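For reference, the 21.3GB/s figure falls out of simple arithmetic: each DDR3 channel is 64 bits (8 bytes) wide, so peak bandwidth is the transfer rate times 8 bytes times the channel count. A quick sketch (the function name is our own):

```python
# Back-of-the-envelope check of the system-memory bandwidth figures
# quoted above. Each DDR3 channel moves 8 bytes per transfer.

def ddr3_peak_gbs(transfers_mts: int, channels: int) -> float:
    """Theoretical peak bandwidth in GB/s (1 GB = 10**9 bytes here)."""
    return transfers_mts * 1e6 * 8 * channels / 1e9

print(f"Dual-channel DDR3/1333: {ddr3_peak_gbs(1333, 2):.1f} GB/s")  # 21.3
print(f"Dual-channel DDR3/1866: {ddr3_peak_gbs(1866, 2):.1f} GB/s")  # 29.9
```

The same math shows why halving the channel count in a single-SO-DIMM notebook cuts the theoretical ceiling to about 10.7GB/s.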

Memory bandwidth isn't everything in the graphics equation, but it does matter quite a bit. So when we started seeing notebooks with integrated graphics shipping with two memory slots and only one of them populated, we scratched our heads and wondered how much it hurt performance.

To find out, we took a Toshiba Portege R830 equipped with two SO-DIMM slots but only one Samsung 4GB DDR3/1333 SO-DIMM, running in single-channel mode. Making our test even more interesting, the notebook was, oddly, running 32-bit Windows 7 Professional, so it couldn't address more than about 3.5GB of RAM anyway. We ran the Portege in three different configurations. The first was the stock 4GB of single-channel DDR3/1333. The second was a standard Corsair DDR3/1333 kit of two 4GB SO-DIMMs in dual-channel mode. The third was a new take on modules hitting notebooks: overclocked modules. Unlike performance desktops that give you control over what frequency your RAM runs at, the vast majority of notebooks have no such BIOS control; they rely solely on the SPD (serial presence detect) chip on the memory to set the speed. Overclocked RAM, such as the Kingston HyperX DDR3/1866 modules we used for our test, tells Sandy Bridge-based notebooks to run the RAM at DDR3/1866 even if you have no way to set it in the BIOS (the Portege, for example, did not). 

For our test, we reached into the dustbin for several older benchmarks, including Quake III, Quake IV, and 3DMark 2006. We also used some newer benchmarks, such as Resident Evil 5 and Dirt 2. To measure theoretical memory bandwidth, we ran SiSoft Sandra 2012 as well.

 

The upshot:

We saw worthwhile performance increases going from single-channel DDR3 to dual-channel DDR3. We should reiterate that even though there's a memory-size difference between the configurations, it has minimal impact, since we're running 32-bit Windows 7. The extra RAM adds nothing; it's really about the memory bandwidth.

In far more dated 3D workloads, where the bottleneck isn't the graphics chip itself, we saw a very significant performance gain of 29 percent in Quake III going from single- to dual-channel. Going to DDR3/1866 pushed that to 38 percent. Moving from Quake III to Quake IV, the frame rates from the feeble integrated graphics plummet, but the performance spread from adding bandwidth is about the same.

With more of a graphics load from 3DMark 2006, we saw the spread drop a bit but still maintain a healthy 21 percent and 33 percent difference from adding more bandwidth. That's not bad, but this is 2012.

But once you get to something far more modern, such as 2009's Dirt 2, the gap from single-channel to dual-channel closes up to about 3 percent. We didn't expect it, but moving to the DDR3/1866 modules gave the game a pretty substantial bump of about 18 percent. That really isn't bad, but it's certainly not magical: you're basically looking at 35 fps vs. 42 fps with a more modern workload. It just reiterates that you can't magically make an integrated graphics part twice as fast by adding more memory bandwidth when running modern workloads.

Again, it's very much about what is holding you back: the graphics core or the memory bandwidth. To illustrate the point, we ran 2009's Resident Evil 5 at an Xbox "HD" resolution of 1280x720 in DX9 mode with the textures set to high. With the integrated Intel HD Graphics 3000 core in the 2.7GHz Core i7-2620M, it isn't hard to swamp the GPU. We still saw about 12.5 percent more frames with the dual-channel configuration and a 24.2 percent bump running the DDR3/1866 modules: that's 27 fps single-channel vs. 31 fps dual-channel and 34 fps with the overclocked modules. With a few tweaks, though, we can get the frame rates up. Running at 1024x768 with the texture level set to low, the frame rates pop up nicely: a 26.4 percent bump going from single-channel to dual-channel, with the overclocked RAM giving us a very nice 37.7 percent increase, or roughly 44 fps in single-channel vs. 60 fps with the DDR3/1866 modules. Since you'll likely have to crank down the image-quality levels anyway, that frame-rate bump can help in gaming.

What about system bandwidth?

To find out, we ran the synthetic memory bandwidth test in SiSoft Sandra 2012. Dual-channel gave us, no surprise, nearly a 100 percent increase over single-channel. Those hot DDR3/1866 modules opened that up to a 167 percent increase in available memory bandwidth.
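The percentage gains quoted throughout this section are simple ratios of the measured scores. Here's the arithmetic, using our Quake III and Sandra numbers as examples (the helper function is our own):

```python
# Percent improvement of one measured score over a baseline score,
# as used for the gains quoted in this section.

def pct_gain(baseline: float, result: float) -> float:
    """Percent improvement of `result` over `baseline`."""
    return (result - baseline) / baseline * 100

# Quake III fps: 231 single-channel, 298 dual-channel, 320 DDR3/1866
print(f"{pct_gain(231, 298):.1f}%")   # 29.0% from dual channel
print(f"{pct_gain(231, 320):.1f}%")   # 38.5% from DDR3/1866

# Sandra bandwidth (GB/s): 9.2 single, 18.1 dual, 24.6 DDR3/1866
print(f"{pct_gain(9.2, 18.1):.1f}%")  # 96.7%, i.e. "nearly 100 percent"
print(f"{pct_gain(9.2, 24.6):.1f}%")  # 167.4%
```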

In the final analysis, we think it's well worth running your new laptop in dual-channel mode if you're chasing 3D performance. The Corsair kit is essentially a steal today, with an 8GB SO-DIMM kit going for $35 after rebate (you'd only need one module if your notebook already has a single SO-DIMM in it, which cuts the price even further). For the overclocked RAM, you'll have to think a bit harder. The Kingston HyperX kit we used fetches about $120 online. You'll have to justify its use at that price, but you will definitely see a frame-rate advantage from it; how much depends on the graphics load. Then again, maybe it would have been a better idea to get a notebook with a discrete graphics part in it. But if you can't, and you're unsatisfied with the gaming performance, increasing the memory bandwidth is definitely a route worth exploring.

Benchmarks
                                                1x 4GB DDR3/1333   2x 4GB DDR3/1333   2x 4GB DDR3/1866
Memory Mode                                     Single Channel     Dual Channel       Dual Channel
Quake III "High-Quality" (fps)                  231                298                320
Quake IV "High-Quality" (fps)                   40.1               50.8               57.6
3DMark 2006 (score)                             3,819              4,648              5,083
Dirt 2, 10x7, Ultra Low (fps)                   35.5               36.6               41.8
Resident Evil 5, 12x7, DX9, AA Off, Motion
  Blur Off, High Textures, Variable Bench (fps) 27.3               30.7               33.9
Resident Evil 5, 10x7, DX9, AA Off, Motion
  Blur Off, Shadow, Texture, and Overall set
  to Low, Variable Benchmark (fps)              43.5               55.0               59.9
SiSoft Sandra (memory bandwidth)                9.2GB/s            18.1GB/s           24.6GB/s

Head to Head: Facebook vs. Google+

Posted: 06 Jan 2012 12:04 PM PST

A metaphorical boxing match between two 800-pound gorillas is quickly shaping up in the social network arena. In one corner: Facebook, the reigning champion. In the other corner: Google+, a fast-rising up-and-comer with a big name and deep pockets behind it. At stake: the time-deprived attention of millions of social network users. There can be only one victor.

Round 1: User Base

The more users a social network has, the more opportunities there are for its users to get gabby. No social media network in history reached 25 million users faster than Google+, which achieved the feat in its very first month despite being invite-only. It took Facebook three years to reach that total, but since then the service has grown like gangbusters and currently claims an utterly ridiculous 750 million users. Even your grandmother probably has a Facebook account.

Winner: Facebook

Round 2: Privacy

Both services force users to sign up with their real names, a requirement we're uncomfortable with. Facebook's been plagued by privacy concerns for years now, and although changing your privacy settings is easy, its privacy options aren't as robust as Google+'s. G+ not only includes more privacy options, it also lets you choose who can see each post you make and which portions of your profile are visible to the public.

Winner: Google+

Round 3: Games

Games are a major component of Facebook: More than half of all Facebook users play games, and Facebook's game library spans approximately a gajillion titles. Games showed up late on Google+, but the dedicated games channel and the ability to post high scores are great touches. Google+'s initial games include blockbusters like Angry Birds and Dragon Age: Legends. Unfortunately, at the time of this writing, there were only 16 Google+ games available.

Winner: Facebook

Round 4: Video Chat

Both networks offer free video chat services that are incredibly easy to use. Facebook's Skype-powered video calling lets you chat one-on-one with your friends and leave video messages if they aren't online. But it can't hold a candle to Google+'s Hangouts, which supports up to 10 people in a simultaneous video chat. Plus, it allows you to watch YouTube videos as a group.

Winner: Google+


Round 5: Mobile Apps

Google+'s mobile app for iPhone and Android devices gets all the basics right, but its highlight is the Huddle feature, a group-chat function similar to the old AOL chat rooms. Facebook countered the threat with its new Mobile Messenger app, which expands upon the features in the standard app. Not only is Facebook's feature set more robust, it's also available for tons of devices—and it isn't plagued by the bugs and crashes that are sometimes found on the Google+ app.

Winner: Facebook

And the Winner Is…

In three out of five rounds, Facebook triumphs over Google+. Sure, it may have some privacy concerns, and it doesn't have quite as clean a look as Google+, but when it comes down to brass tacks, Facebook's seniority shows in its deep user base and myriad options. There's a lot to like in Google's fledgling network, but Facebook just makes it easier to be social.

Hackers Nab Norton Antivirus Source Code

Posted: 06 Jan 2012 11:39 AM PST

Who watches the watchmen? Alan Moore took a long, hard look at that question in the classic Watchmen graphic novel, but today we finally got a firm answer, at least if by "watchmen" you mean "computer security companies." Symantec got the virtual equivalent of egg on its face after an Indian hacking group going by the name "The Lords of Dharmaraja" managed to get its digital hands all over the Norton antivirus source code.

Actually, as embarrassing as it is, the theft isn't as bad as it sounds. According to The Register, Symantec confirmed that the hackers indeed had a portion of source code, but from a 2006 enterprise version of the software, not from anything recent or consumer-focused.

"This does not affect Symantec's Norton products for our consumer customers," Symantec said in a statement to the website. No current versions of enterprise software are considered vulnerable either, and at this time, Symantec doesn't believe any customer data was stolen.

So how'd it happen? That's up in the air. All Symantec will own up to is that the breach occurred at a third party, not on its own servers. Still, plenty of people must be mighty red-faced right about now.

EVGA GTX 560 Ti 448 FTW Review

Posted: 06 Jan 2012 11:37 AM PST

The not-quite GTX 570

When is a GTX 560 Ti not really a GTX 560 Ti? When it's almost a GTX 570.
Nvidia's latest GPU, the GTX 560 Ti 448, is really a GTX 580 (based on the chip originally dubbed the GF110) with two functional blocks disabled, reducing its CUDA core count from 512 to 448. The GTX 570 is a GF110 with one functional block disabled, endowing it with 480 CUDA cores. The original GTX 560 Ti is a completely different chip, with different power requirements, but all 384 of its cores are fully functional.

Priced at $290, the 560 Ti 448 fills a price gap between the $250 GTX 560 Ti and the $350 GTX 570. Because yields for GF110 GPUs have improved, the Ti 448 is a limited-edition part, so it's unclear how long it will remain on the market. And since we're approaching the end of a GPU generation, it's likely that many of the processors around today will soon ride off into the sunset. If you really want a GTX 570 but can't swing the price, the 560 Ti 448 might fill the bill. Like most retail cards based on this chip, EVGA's GTX 560 Ti 448 is factory overclocked, to 797MHz. Compare that to the typical GTX 570 design (Asus's ENGTX570, for instance), in which the GPU runs at a stock clock of 742MHz. We also compared EVGA's card to a couple of other factory-overclocked SKUs: namely, the Asus GTX 560 Ti DirectCU II and the MSI Radeon HD 6950 Twin Frozr III.

The two-slot GeForce GTX 560 Ti 448 FTW is outfitted with two dual-link DVIs and one each HDMI and DisplayPort on its mounting bracket.

Note: We've made some minor changes to both our test bed and our game benchmarks, so don't compare these performance numbers to our earlier reviews.

The GTX 570 pulls slightly ahead of the pack in apps that make heavy use of shader programs (that's Just Cause 2 and the Unigine Heaven 2.5 synthetic benchmark in our suite). In most other benchmarks, it's either a wash or the GTX 560 Ti 448 posts a slight lead. MSI's implementation of AMD's Radeon HD 6950 keeps up in some benchmarks (Shogun 2, STALKER: CoP, and Metro 2033), but it falls behind cards based on Nvidia's GF110 in the other tests. On the other hand, the Asus GTX 560 Ti trails the field in nearly all the benchmarks, edging out the HD 6950 in just a couple (Just Cause 2 and HAWX 2). 

So this card is cheaper than the GTX 570, but it still costs nearly $300. Note also that our GTX 570 isn't a factory-overclocked version; most GTX 570 cards currently shipping deliver higher clock speeds, so the performance gap between the EVGA GTX 560 Ti 448 and those cards will likely be wider.

The GTX 560 Ti 448 is also about the same size as other GTX 560 Ti cards, which means it will fit in more compact cases. So if you're looking for a little more performance juice in a small-form-factor gaming rig, EVGA's Ti 448 FTW is definitely worth a look.

New Power-Saving DevSleep Feature Added To SATA Specification

Posted: 06 Jan 2012 11:18 AM PST

More power is a good thing when you're talking desktops, but for notebooks, more power means less battery life, and in this age of Ultrabooks and ultraportables, that just isn't acceptable to a lot of manufacturers. In yet another step toward making those Ultrabooks ultra long lasting, the SATA-IO organization announced a new feature yesterday: SATA DevSleep. Basically, DevSleep lets the PHY and other circuitry drop into an almost completely powerless state, rather than a still-power-consuming "Partial" or "Slumber" state, when it isn't being used.

"With DevSleep an ultra-thin laptop that would have previously needed to be put into standby to conserve battery life can now stay on and be immediately available for use," SATA-IO boasts in its press release.

An infographic on the SATA-IO website claims that hard drives and displays are the two most power-hungry components of any notebook, and DevSleep should help Ultrabook SSDs lower energy usage without too noticeable an effect on performance. The whitepaper describing the new feature lays things out in a graph (recreated above): partial and slumber modes use about 100mW of energy and "wake up" inside of 10 milliseconds, while the new DevSleep mode uses just 5mW and still wakes up in 20ms. Not too shabby.
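Back-of-the-envelope math on those whitepaper figures shows why DevSleep matters for battery life. This is a rough sketch using only the two power numbers quoted above; the variable names are our own:

```python
# Rough comparison of the idle link-power figures from SATA-IO's
# whitepaper: ~100mW for partial/slumber vs. ~5mW for DevSleep.

SLUMBER_MW, DEVSLEEP_MW = 100, 5

savings_pct = (SLUMBER_MW - DEVSLEEP_MW) / SLUMBER_MW * 100
mwh_saved_per_idle_hour = SLUMBER_MW - DEVSLEEP_MW  # mW held for one hour

print(f"Power reduction: {savings_pct:.0f}%")              # 95%
print(f"Saved per idle hour: {mwh_saved_per_idle_hour} mWh")  # 95 mWh
```

A 95 percent cut in idle link power is small next to a whole-system power budget, but over the many hours an Ultrabook's SSD sits idle, it adds up.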

Of course, as good as it sounds, for now it's still just paper. There's no word yet on when we'll actually begin to see SATA devices with DevSleep available in the wild.

Cool Site of the Week: TweepsMap

Posted: 06 Jan 2012 11:13 AM PST

There's no denying that Twitter's become an important part of our lives, bringing us a firsthand view of the profane, the mundane, and everything in between from around the globe. By firing off a tweet, you're not just speaking your mind, you're adding to a far-reaching cultural mosaic that speaks of our thoughts, dreams, loves, and hates, moment by moment. If you've ever wondered who's reading the 140-character toots you've been spewing, you'll love TweepsMap, our Cool Site of the Week.

TweepsMap is a web app that analyzes the whereabouts of your Twitter followers and then, as the name suggests, visualizes those whereabouts on a map. Using TweepsMap couldn't be easier: Just enter your Twitter credentials, tell Twitter that you're cool with allowing TweepsMap to access your account, and watch as your followers pop up on a map of the world. TweepsMap shows users what percentage of their followers hail from each country of the world. These numbers can also be broken down into regional, state, and province information, making it a cinch to find out how many people are following you in County Galway, Ireland, and monitoring your movements in Maryland.

Best of all, users can switch from the map view of the statistical breakdown to a pie chart to better visualize their followers' locations at a glance. 

 

 
