Ubuntu apt-get update GPG error

Running apt-get update fails with a signature-verification warning:

W: An error occurred during the signature verification. The repository is not updated and the previous index files will be used. GPG error: http://storage.googleapis.com/bazel-apt stable InRelease: The following signatures were invalid: KEYEXPIRED 1527185977 KEYEXPIRED 1527185977 KEYEXPIRED 1527185977
W: Failed to fetch http://storage.googleapis.com/bazel-apt/dists/stable/InRelease The following signatures were invalid: KEYEXPIRED 1527185977 KEYEXPIRED 1527185977 KEYEXPIRED 1527185977
W: Some index files failed to download. They have been ignored, or old ones used instead.

The repository's signing key has expired. Re-import the current Bazel release key, then update:

curl https://bazel.build/bazel-release.pub.gpg | sudo apt-key add -
sudo apt-get update

Visualizing using D3 Framework Library

Ruby on Rails Basic Commands

rails generate controller ControllerName action_name

This generates the controller class, a view, and a route for each listed action.

Amazon S3 Bucket Config (Bucket Policy)

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AddPerm",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::(address)/*"
        }
    ]
}

For example, for a bucket reachable at hello_world.s3.amazonaws.com, the Resource becomes arn:aws:s3:::hello_world/*.
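A small shell sketch of filling the bucket name into the policy above. The bucket name hello_world and the output file policy.json are just examples; the generated file can then be applied with `aws s3api put-bucket-policy --bucket <name> --policy file://policy.json` or pasted into the S3 console.

```shell
#!/bin/sh
# Substitute a bucket name into the public-read GetObject policy.
make_policy() {
    bucket="$1"
    cat <<EOF
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AddPerm",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::${bucket}/*"
        }
    ]
}
EOF
}

# "hello_world" is a placeholder bucket name.
make_policy "hello_world" > policy.json
```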

Using Wget to download website HTML

wget --recursive --no-clobber --page-requisites --html-extension --convert-links --restrict-file-names=windows --domains ahmad.works --no-parent https://ahmad.works/writing/

This command downloads https://ahmad.works/writing/. The options are:

--recursive: download the entire site.
--domains ahmad.works: don't follow links outside ahmad.works.
--no-parent: don't follow links outside the writing/ directory.
--page-requisites: get all the elements that compose the page (images, CSS and so on).
--html-extension: save files with the .html extension.
--convert-links: convert links so that they work locally, off-line.
--restrict-file-names=windows: modify filenames so that they also work on Windows.
--no-clobber: don't overwrite existing files (useful when a download is interrupted and resumed).

Oracle Virtualbox Setup Share Folder

sudo mkdir -p /mnt/{name}
sudo mount -t vboxsf -o uid=$(id -u),gid=$(id -g) {name} /mnt/{name}

Here {name} is the share name configured in the VM's Shared Folders settings; the uid/gid options make the mounted files owned by the current user.
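To make the mount persistent across reboots, an /etc/fstab entry along these lines can be used (assuming the same share name and mount point, Guest Additions installed, and uid/gid 1000 for the first user):

```
{name}  /mnt/{name}  vboxsf  uid=1000,gid=1000  0  0
```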

Import and Export MySQL Database via CLI

To export a MySQL database, use the mysqldump program; it is installed separately from the mysql client.

mysqldump --host=" " -u " " -p "database_name" > "filename".sql

Importing an .sql file:

mysql --host=" " -u " " -p "database_name" < "filename".sql

Importing multiple .sql files:

for SQL in *.sql; do mysql --host="" --port="" -u "" --password="" {database_name} < "$SQL"; done
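A slightly more defensive version of the loop above: filenames are quoted and the loop aborts on the first failed import instead of silently continuing. The mysql invocation is replaced here by a `cat` stand-in (and two sample files are created) so the control flow can be dry-run without a database server; swap IMPORT_CMD for your real mysql command line.

```shell
#!/bin/sh
# Stand-in for the real client; replace with something like:
#   IMPORT_CMD='mysql --host=localhost -u root --password=secret mydb'
IMPORT_CMD='cat > /dev/null'

# Two sample files so the loop can be exercised without a server.
printf 'SELECT 1;\n' > 001_schema.sql
printf 'SELECT 2;\n' > 002_data.sql

for sql in *.sql; do
    echo "Importing $sql"
    # Quote "$sql" so filenames with spaces survive; stop on first failure.
    sh -c "$IMPORT_CMD" < "$sql" || { echo "Failed on $sql" >&2; exit 1; }
done
echo "All imports done"
```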

Apache2 PHP Minimum Config Settings

sudo vim /etc/php5/apache2/php.ini

; post
post_max_size = 35M
; upload
upload_tmp_dir = /var/tmp/
upload_max_filesize = 100M
max_file_uploads = 100
; timezone
date.timezone = "Asia/Seoul"
; session
session.save_handler = memcache
session.save_path = "cfg.apn2.cache.amazonaws.com:11211"
memory_limit = 1024M

Note that PHP caps any single upload at post_max_size, so post_max_size should be at least as large as upload_max_filesize.

sudo apt-get install php5-mysqlnd
sudo apt-get install php5-mysqlnd-ms
sudo vim /etc/php5/apache2/conf.d/20-mysqlnd_ms.ini

mysqlnd_ms.enable=1
mysqlnd_ms.config_file=/var/www/database/system/mysqlnd_ms.json

Link node command to nodejs

ln -s /usr/bin/nodejs /usr/bin/node

Apache2 Prefork Config

An Apache2 server using the prefork MPM handles each connection in its own process rather than a thread. Adjust these settings for better performance:

<IfModule mpm_prefork_module>
    ServerLimit            1024
    StartServers           1024
    MinSpareServers        256
    MaxSpareServers        256
    MaxRequestWorkers      1024
    MaxConnectionsPerChild 0
</IfModule>

Show All PHP Errors

Show all PHP errors from inside PHP code:

ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);