
Thursday, 8 May 2014

Incapsula expanding from 16 to 30 POPs by the end of 2014

Today we received exciting news from our DDoS partner, Incapsula, who shared their network expansion plan for 2014. With the first quarter behind them and three new data centers already in place, Incapsula announced their commitment to doubling their network size, expanding from 16 to 30 Points of Presence (POPs) by the end of the year.

With these new data centers, we expect Incapsula’s overall network capacity to surpass 1.5 Tbps. Plus, each of Incapsula’s POPs will be upgraded to cache 30 times more than its current capacity by adding new border switches with increased port density.

Where Are the New Data Centers?

Western Hemisphere
  1. Atlanta, USA
  2. Lima, Peru
  3. Madrid, Spain
  4. Mexico City, Mexico
  5. Sao Paulo, Brazil
  6. Toronto, Canada
Eastern Hemisphere
  1. Auckland, New Zealand
  2. Hong Kong, China
  3. Milan, Italy
  4. Mumbai, India
  5. Seoul, South Korea
  6. Vienna, Austria
  7. Warsaw, Poland

Below are some of the latest updates from Incapsula about their DDoS capacity. We at EVERWORKS.com have been using Incapsula's CDN and DDoS protection solution for our clients in Indonesia, Malaysia, Thailand and Vietnam. So far, our clients have all been satisfied with the speed and protection Incapsula provides for their online businesses.

Staying Ahead of DDoS Attacks
Our rapid network expansion is fueled not only by Incapsula's own growth but also by the growth of volumetric DDoS attacks. Since the beginning of 2013, we've witnessed an unprecedented increase in the number of large-scale DDoS threats, to the point where one in every three attacks now exceeds 20Gbps, and 100Gbps+ DDoS threats are no longer uncommon.

With several large-scale DDoS events occurring on a daily basis, and with more and more websites looking to Incapsula for protection, we knew it was time to invest: not just for the needs of today, but also for the goals of tomorrow.

And so, as big as today's announcement is for us, it is still just one piece of a much bigger picture, with the next big DDoS-related update already around the corner.

Saturday, 28 September 2013

Allow or Deny Search Engines by Using Robots.txt File

What is Robots.txt?
Website owners use the /robots.txt file to give web robots instructions about their site; this convention is called the Robots Exclusion Protocol. If you have portions of a website that you do not wish search indexes to see, you can protect them with a robots.txt file dictating which search engines are allowed or disallowed from accessing specific folders and files.

There are many directives you can specify in a robots.txt file to explicitly allow or deny specific search bots access to certain folders or files. The simplest robots.txt file uses two rules:

  1. User-agent: the robot the following rule applies to
  2. Disallow: the URL you want to block

Together, these two lines form a single entry in the file. You can include as many entries as you want, and a single entry can contain multiple Disallow lines and apply to multiple user agents.
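As a minimal illustration (the paths and bot name here are made-up examples, not taken from any real site), a robots.txt with two entries might look like this:

```
# Entry 1: every crawler is barred from the /private/ folder
User-agent: *
Disallow: /private/

# Entry 2: a specific bot is barred from the entire site
User-agent: BadBot
Disallow: /
```

The file lives at the root of the site (e.g. http://www.example.com/robots.txt), and each entry is a User-agent line followed by one or more Disallow lines.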

Two important considerations when using /robots.txt:

  1. Robots can ignore your /robots.txt. In particular, malware robots that scan the web for security vulnerabilities, and email address harvesters used by spammers, will pay no attention to it.
  2. The /robots.txt file is publicly available. Anyone can see which sections of your server you don't want robots to use, so don't try to use /robots.txt to hide information.
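A well-behaved crawler applies exactly these rules before fetching a URL. As a sketch of how a given file would be interpreted, Python's standard library ships a parser for the Robots Exclusion Protocol (the rules and URLs below are hypothetical examples):

```python
from urllib.robotparser import RobotFileParser

# Parse an in-memory robots.txt instead of fetching one over HTTP
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Compliant bots ask before fetching; disallowed paths return False
print(rp.can_fetch("Googlebot", "http://www.example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "http://www.example.com/index.html"))         # True
```

This is also a quick way for an administrator to sanity-check a robots.txt file before publishing it.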

Can I Block the Bad Robots?
In theory, yes; in practice, no. If a bad robot obeys /robots.txt and you know the name it scans for in the User-agent field, you can create a section in your /robots.txt to exclude it specifically. But almost all bad robots ignore /robots.txt, making that pointless.



DDoS Attack via Bad Robots
If a bad robot operates from a single IP address, you can block its access to your web server through server configuration or with a network firewall. In a DDoS situation, however, the robots operate from many different IP addresses (hijacked PCs that are part of a large botnet) and generate attack load simply by scanning your website. This can slow or stop the entire web server with very little bandwidth involved. It is considered a Layer 7 DDoS attack, i.e. an application-layer attack.

The easiest solution is to use an advanced firewall to automatically block all IP addresses that open too many connections. To learn more about our DDoS protection solution, please visit http://www.everworks.com/Services/DDoS
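The core idea behind that kind of automatic blocking can be sketched in a few lines: count requests per client IP over a time window and flag the outliers. This is a toy illustration with made-up addresses and an arbitrary threshold, not how any specific firewall product is implemented:

```python
from collections import Counter

def flag_heavy_clients(client_ips, threshold=100):
    """Return the set of IPs that made more requests than the threshold."""
    counts = Counter(client_ips)
    return {ip for ip, n in counts.items() if n > threshold}

# Toy traffic sample: one bot hammering the server, two normal visitors
requests = ["203.0.113.5"] * 500 + ["198.51.100.7"] * 12 + ["192.0.2.9"] * 3
print(flag_heavy_clients(requests))  # {'203.0.113.5'}
```

A real firewall would apply this over a sliding time window and feed the flagged addresses into block rules. Against a large botnet, where each individual IP can stay under any reasonable threshold, a cloud-based scrubbing service is usually needed instead.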

Please find below some other articles that explain how a robots.txt file works and how to configure it for your website.

Saturday, 1 June 2013

Another 167Gbps DDoS Attack Using the DNS Reflection Method

Still remember the recent 300Gbps DDoS attack on Spamhaus? It was recorded as the largest DDoS attack ever seen. On 27th May, another DDoS attack using a similar method (DNS reflection) was launched against an unnamed financial firm, peaking at 167Gbps. Prolexic announced that they successfully mitigated the attack for their client by distributing it across their four cloud-based scrubbing centers in Hong Kong, London, San Jose and Ashburn. According to Prolexic, the London scrubbing center mitigated the majority of the attack, around 90Gbps. Although smaller than the Spamhaus assault, it still registered as the largest attack ever defended by Prolexic in its 10-year history.
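To see why DNS reflection produces such large floods, it helps to run the back-of-the-envelope arithmetic. The packet sizes below are illustrative assumptions (a small spoofed query versus a large response), not measurements from this incident:

```python
QUERY_BYTES = 64       # small DNS query with a spoofed source address
RESPONSE_BYTES = 3000  # large response, e.g. an ANY query with DNSSEC records

# Open resolvers reply to the victim, multiplying the attacker's bandwidth
amplification = RESPONSE_BYTES / QUERY_BYTES  # ~47x under these assumptions
print(f"amplification: ~{amplification:.0f}x")

# Bandwidth the attacker must send to reflect a 167 Gbps flood
attack_gbps = 167
print(f"spoofed query traffic needed: ~{attack_gbps / amplification:.1f} Gbps")
```

Under these assumed numbers, a botnet controlling only a few Gbps of upstream bandwidth can reflect a flood in the 167Gbps range, which is why open DNS resolvers make such attractive amplifiers.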