Fresh content on a website serves multiple purposes, including keeping you in the search engine loop. Updating your site is just as important to search engines as it is to your visitors. Search engines generally rank websites that offer new information and sources for search requests more favourably. However, being desirable to search engines is not the only reason to update your site frequently.
We often find it hard to source fresh content for our website. We have to go through tons of websites, blog articles and much more to gather a decent amount of content to share. Continue reading “Using IBM Watson’s Discovery Service to get fresh content”
We come across the need for various Linux (Ubuntu/CentOS) commands daily while maintaining our servers, and it becomes hard to remember each and every command by heart. So I thought of collecting all the commands in one place.
- Linux version:
uname -a
- Check if Linux is 32-bit or 64-bit:
uname -m
- Connect to mysql using command line :
mysql -u username -pPassword
- Backup mysql table in a file on server:
mysql -u username -p --database=your_dbname --host=your_hostname --port=3306 --batch -e "select * from table_name" | sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g' > your_backup_filename
- Current disk usage:
df -h
- Memory usage:
free -h
- Delete a folder and the files in it:
rm -rf folder-name
- Check log activity:
tail -f *path-to-log*, e.g. tail -f /var/log/apache2/error.log
- Enable any module:
sudo a2enmod *module-name*, e.g. sudo a2enmod rewrite
- Zip file:
sudo zip -r file.zip folder/
- To list the largest directories from the current directory in human readable format:
du -sh * | sort -hr | head -n10
- To recursively list all files in a directory, including files in symlinked directories:
find -L . -type f
- Create swap memory in Linux:
- Command 1:
sudo dd if=/dev/zero of=/swapfile bs=1G count=4
- Command 2:
sudo chmod 600 /swapfile
- Command 3:
sudo mkswap /swapfile
- Command 4 (enable the swap file):
sudo swapon /swapfile
- Unzip file: If the unzip command isn’t already installed on your system, then run:
- Install unzip:
sudo apt-get install unzip
- After installing the unzip utility, if you want to extract to a particular destination folder, you can use:
unzip file.zip -d destination_folder
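The disk-usage pipeline from the list above can be tried safely end-to-end in a scratch directory. A minimal sketch (the directory and file names are made up for the demo):

```shell
#!/bin/sh
# Build a scratch directory with two subdirectories of different sizes,
# then run the du | sort | head pipeline from the list above.
tmp=$(mktemp -d)
mkdir "$tmp/big" "$tmp/small"
dd if=/dev/zero of="$tmp/big/data" bs=1024 count=2048 2>/dev/null   # ~2 MB
dd if=/dev/zero of="$tmp/small/data" bs=1024 count=16 2>/dev/null   # ~16 KB

# Largest directory first, human-readable sizes, top 10 entries.
first=$(cd "$tmp" && du -sh -- * | sort -hr | head -n10 | head -n1)
echo "$first"    # "big" should be listed first

rm -rf "$tmp"
```

The `--` guard keeps `du` from treating a file name that starts with a dash as an option; `sort -hr` understands the human-readable suffixes (K, M, G) that `du -h` emits.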
I recently had a situation where I had to alter huge tables (more than 3 million records). Normally we follow the standard process of altering a table: open your SQL editor, type in your SQL command, execute it, and the table is altered. But when it comes to huge tables, you may want to alter your approach.
There are 2 ways to alter a big table:
- Create a new table (a copy of the original), make the necessary changes to the schema, stop writes to the original table, copy the original table's data to the new table, and rename the tables. And we are done.
Note: when the table is constantly changing, this option is not ideal; it can result in data loss and inconvenience for the users.
- Use Percona toolkit
Percona Toolkit (specifically pt-online-schema-change) is a reliable way to alter your tables without any data loss and with minimal downtime (close to zero). It does not block writes while running, and it accounts for changes made to the table while it is performing the operation. Continue reading “Using percona with AWS RDS – Editing Big DB tables”
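For reference, the copy-and-swap procedure of the first option can be written out as plain SQL. The `orders` table and `status` column below are hypothetical; `RENAME TABLE` swaps both names in one atomic step, so readers never see a missing table:

```shell
# Print the SQL for the manual copy-and-swap (option 1). Table and column
# names are hypothetical; pipe this into `mysql` against your own database.
sql=$(cat <<'SQL'
CREATE TABLE orders_new LIKE orders;
ALTER TABLE orders_new ADD COLUMN status TINYINT NOT NULL DEFAULT 0;
-- orders.* supplies the old columns; the literal 0 fills the new one
INSERT INTO orders_new SELECT orders.*, 0 FROM orders;
-- atomic swap: readers never see a missing table
RENAME TABLE orders TO orders_old, orders_new TO orders;
SQL
)
echo "$sql"
```

With Percona Toolkit, the same change is a single command, e.g. `pt-online-schema-change --alter "ADD COLUMN status TINYINT NOT NULL DEFAULT 0" D=mydb,t=orders --execute` (database and table names again hypothetical).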