Tag Archives: DevOps

Remove Tencent Cloud (QCloud) Cloud Monitor

bash /usr/local/qcloud/stargate/admin/uninstall.sh
bash /usr/local/qcloud/YunJing/uninst.sh
bash /usr/local/qcloud/monitor/barad/admin/uninstall.sh

rm -rf /usr/local/sa
rm -rf /usr/local/agenttools
rm -rf /usr/local/qcloud

process=(sap100 secu-tcs-agent sgagent64 barad_agent agent agentPlugInD pvdriver)
for i in ${process[@]}
do
  for A in $(ps aux | grep $i | grep -v grep | awk '{print $2}')
  do
    kill -9 $A
  done
done

# Optional
chkconfig --level 35 postfix off
systemctl stop postfix
systemctl mask postfix
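After the uninstall scripts and the kill loop, it is worth confirming that nothing survived. A minimal check reusing the same process list (`check_leftovers` is my own helper name; I left out the over-broad `agent` entry, which would also match unrelated processes such as ssh-agent):

```shell
#!/usr/bin/env bash

# Report any Tencent Cloud agent processes that are still alive
check_leftovers() {
  local leftovers=""
  for name in sap100 secu-tcs-agent sgagent64 barad_agent agentPlugInD pvdriver; do
    if pgrep -x "$name" >/dev/null 2>&1; then
      leftovers="$leftovers $name"
    fi
  done
  echo "$leftovers"
}

# Prints nothing when the removal was complete
check_leftovers
```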

The Simplest MediaWiki Update Script for Single-Server MediaWiki Site

System requirements:

  • User uploads ($wgUploadDirectory) are stored offsite
  • Non-Docker MediaWiki with normal setup
  • Composer installed (can be installed automatically during the update)

Goals:

  • Update MediaWiki with nearly zero downtime
  • Download and install latest tagged MediaWiki from tarball package
  • Update extensions and skins from latest git tagged branch
  • Install extension-specific dependencies during updating

First, create a script named mw-update.sh and make sure it exits if anything goes wrong:

#!/usr/bin/env bash

# Exit the whole script if anything goes wrong
set -e

Create a config file mw-update.conf with the following content:

version=1.35.1
tmp="/tmp"
dest_base="/srv/www/wikiroot"
dest="/public_html"
permissions="nginx.nginx"
extensions="
BetaFeatures
CheckUser
CommonsMetadata
MobileFrontend
cldr
Flow
ContributionScores
intersection
"
skins="
MinervaNeue
"

Switch back to your script. Add more variables to use later:

MWU_START=`date +%s`

# Set the current working directory to the directory of this script
cd ${0%/*}

# Check if config exists
if [ ! -f ./mw-update.conf ]; then
  echo -e "Config not found, run the following command first:"
  echo -e "\n$ cp mw-update.sample.conf mw-update.conf"
  exit 1
fi

# Parse config
. mw-update.conf

# MediaWiki uses git tag for latest stable version
MW_VER=$version

# Decoration
BLUE='\033[0;34m'
NC='\033[0m' # No Color

# Production dir
PROD_BASE="$dest_base"
PROD_DIR="$PROD_BASE$dest"

# `MW_VER_MAIN` is for the stupid download URL
MW_VER_MAIN=$(echo $MW_VER | sed -E 's/\.[0-9]+$//g')

# Temp dir prepare for update
TMP_BASE="$tmp/mediawiki-update"
TMP_DIR="$TMP_BASE/mediawiki-$MW_VER"

# Extensions dir
EXT_DIR="$TMP_DIR/extensions"
SKIN_DIR="$TMP_DIR/skins"

# Extensions use git branch for latest stable version
EXT_VER=$(echo REL$MW_VER_MAIN | sed -E 's/\./_/g')

# List of custom extensions
EXTENSIONS="$extensions"
SKINS="$skins"
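The two sed transforms above are easy to sanity-check in isolation. With version=1.35.1 they should produce 1.35 for the download URL and REL1_35 for the extension branches:

```shell
#!/usr/bin/env bash

MW_VER="1.35.1"

# Strip the patch component for the download URL path
MW_VER_MAIN=$(echo "$MW_VER" | sed -E 's/\.[0-9]+$//g')

# Extensions and skins track REL<major>_<minor> branches
EXT_VER=$(echo "REL$MW_VER_MAIN" | sed -E 's/\./_/g')

echo "$MW_VER_MAIN"   # 1.35
echo "$EXT_VER"       # REL1_35
```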

Then print the setup info for confirmation:

# Prompt before executing anything
echo -e "${BLUE}MediaWiki Auto Updater${NC}"
echo -e "Tunghsiao Liu ([email protected])\n"
echo -e "     Core version: ${BLUE}$MW_VER${NC}"
echo -e "   Branch version: ${BLUE}$MW_VER_MAIN${NC}"
echo -e "Extension version: ${BLUE}$EXT_VER${NC}"
echo -e "   Temp directory: ${BLUE}$TMP_BASE${NC}"
echo -e " Destination base: ${BLUE}$PROD_BASE${NC}"
echo -e "      Destination: ${BLUE}$PROD_DIR${NC}"
echo -e "      Permissions: ${BLUE}$permissions${NC}"
echo -e "Custom extensions: ${BLUE}$EXTENSIONS${NC}"
echo -e "     Custom skins: ${BLUE}$SKINS${NC}"

Continue executing if user confirms:

read -p "Press enter to continue..."
echo -e "\n"

Check if PHP Composer is installed:

function check_composer() {
  # Allow running Composer with root
  export COMPOSER_ALLOW_SUPERUSER=1

  if command -v composer >/dev/null 2>&1 ; then
    echo -e "${BLUE}PHP Composer found: $(composer --version | head -n 1)${NC}"
  else
    echo -e "PHP Composer not found, trying to install it..."
    php -r "copy('https://getcomposer.org/installer', 'composer-setup.php');"
    php composer-setup.php
    php -r "unlink('composer-setup.php');"
    echo -e "Move PHP Composer to /usr/local/bin/composer for global use"
    mv composer.phar /usr/local/bin/composer
  fi
}

Begin updating the MediaWiki core:

# Update core files
echo -e "${BLUE}Creating essential directories${NC}"
mkdir -p $TMP_BASE

echo -e "${BLUE}Updating MediaWiki core files${NC}"
cd $TMP_BASE
wget -c https://releases.wikimedia.org/mediawiki/$MW_VER_MAIN/mediawiki-$MW_VER.tar.gz
tar -zxf mediawiki-$MW_VER.tar.gz

echo -e "${BLUE}Backing up LocalSettings.php${NC}"
cd $TMP_DIR
cp $PROD_DIR/LocalSettings.php $TMP_DIR/

Update extensions:

# Update extensions
for extension in $EXTENSIONS
do
  echo -e "${BLUE}Updating extension $extension...${NC}"
  cd $EXT_DIR
  if [ ! -d "$EXT_DIR/$extension" ]; then
    echo -e "${EXT_DIR}/${extension} git repo does not exist, cloning it..."
    git clone https://gerrit.wikimedia.org/r/mediawiki/extensions/$extension.git
  fi
  cd $EXT_DIR/$extension
  git reset --hard
  git clean -f -d
  git pull
  git checkout $EXT_VER
  echo ""
done

Some extensions need a special setup process, such as updating git submodules or installing dependencies with Composer:

if [[ ${extensions} =~ "Flow" ]]; then
  echo -e "${BLUE}Updating composer for Flow (StructuredDiscussions)...${NC}"
  cd $EXT_DIR/Flow
  check_composer
  composer update --no-dev
  echo ""
fi
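For the git submodule case mentioned above, a hedged helper sketch (`update_submodules` is my own name; none of the extensions in this config actually need it, so treat it as an optional extra step after the checkout):

```shell
#!/usr/bin/env bash
set -e

# Initialize vendored submodules after a branch checkout,
# but only when the repository actually uses them
update_submodules() {
  local dir="$1"
  if [ -f "$dir/.gitmodules" ]; then
    git -C "$dir" submodule update --init --recursive
  fi
}
```

It can be called right after git checkout $EXT_VER in the extension loop.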

Update skins:

# Update skins
for skin in $SKINS
do
  echo -e "${BLUE}Updating skin $skin...${NC}"
  cd $SKIN_DIR
  if [ ! -d "$SKIN_DIR/$skin" ]; then
    git clone https://gerrit.wikimedia.org/r/mediawiki/skins/$skin.git
  fi
  cd $SKIN_DIR/$skin
  git reset --hard
  git clean -f -d
  git pull
  git checkout $EXT_VER
  echo ""
done

Please note that all of the above steps happen inside the $TMP_BASE directory; the live site is still untouched at this point. Now let's go into $TMP_BASE to finalize the preparation:

cd $TMP_BASE

echo -e "${BLUE}Backing up old files...${NC}"
# tar mediawiki-$MW_VER-update-backup.tar.gz -C $PROD_DIR .
mv $PROD_DIR $PROD_BASE/mediawiki-$MW_VER-backup-$(date +%F-%H:%M)

echo -e "${BLUE}Moving updated files to production...${NC}"
cp -R $TMP_DIR $PROD_DIR

echo -e "${BLUE}Fixing directory permissions...${NC}"
chown -R $permissions $PROD_DIR

echo -e "${BLUE}Running MediaWiki maintenance...${NC}"
php $PROD_DIR/maintenance/update.php --quick

MWU_END=`date +%s`
MWU_RUNTIME=$((MWU_END - MWU_START))

echo -e "${BLUE}Done! Time took: ${MWU_RUNTIME}s${NC}"
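If the maintenance run goes wrong, the timestamped backup created earlier makes rollback a single move. A minimal sketch (`rollback` is my own helper name; the backup path must match the one produced by the update run):

```shell
#!/usr/bin/env bash
set -e

# Swap the failed tree for the timestamped backup
rollback() {
  local prod_dir="$1" backup_dir="$2"
  rm -rf "$prod_dir"
  mv "$backup_dir" "$prod_dir"
}
```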

GeoIP Bypassing for Nginx Proxy

Goal:

  • Proxy content for requests from a specific country or region
  • Redirect any requests made outside that country or region to the original URL (to save bandwidth)

geoip_country         /usr/share/GeoIP/GeoIPv6.dat;
map $geoip_country_code $proxy_direct_pass {
  default yes;
  CN no;
}

location ~* ^/proxied-content/(.*)$ {
  if ($proxy_direct_pass = yes) {
    return 302 https://original_content/$1$is_args$args;
  }

  proxy_pass https://original_content/$1$is_args$args;
}
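Note that geoip_country comes from ngx_http_geoip_module, which is not compiled into every build. On distributions that ship it as a dynamic module, it must be loaded at the top of nginx.conf first; a sketch (the module path is distribution-specific):

```nginx
# Load the GeoIP module if it was built dynamically
load_module modules/ngx_http_geoip_module.so;
```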

Proxying and Caching WebP Images Using the Same URI Based on User Accept Headers with Nginx

Case:

  • The proxied image backend serves WebP images when the requesting client supports them, as indicated by the Accept header ($http_accept)
  • The backend serves the same URI for both formats, which means a URI like image.png can also come back as WebP

The solution:

  • Using Nginx map module
  • Apply variables to different cache pools

In nginx.conf:

# Proxy cache pools for image caching
proxy_cache_path        /dev/shm/image_cache
                        keys_zone=image_cache:10m;

proxy_cache_path        /dev/shm/image_cache_webp
                        keys_zone=image_cache_webp:10m;

# Differentiate WebP requests
map $http_accept $webp_pool {
  default                 image_cache;
  ~*webp                  image_cache_webp;
}

In your site config:

proxy_cache             $webp_pool;
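Since the same URI now resolves to two different payloads, it also helps to tell downstream caches to keep the variants apart. A hedged addition for the same site config (only needed if the backend does not already send this header):

```nginx
# Keep intermediate caches from mixing WebP and non-WebP variants
add_header Vary Accept;
```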

Allow WordPress Embedded Posts with Global X-Frame-Options for Nginx Servers

The problem: when you enable X-Frame-Options globally, you won’t be able to embed your posts with the latest WordPress embed method.

The solution: you can simply exclude the embed URLs in your Nginx configuration. I’ll use an Nginx map for better performance:

map $request_uri $x_frame_options_headers {
  default                 SAMEORIGIN;
  # Matching WordPress embed page, ie. https://example.com/my-post/embed#?secret=vLi4CQcWkH
  ~/embed                 "";
}

# Don't allow the browser to render the page inside a frame or iframe
add_header X-Frame-Options $x_frame_options_headers;

SELinux policy for nginx and GitLab unix socket in Fedora 19

The installation of GitLab in Fedora 19 went fine. I followed the official installation guide with some deviations where necessary, mostly taken from the CentOS guide in gitlab-recipes. I setup nginx using the ssl config, and poked some holes in iptables. For systemd services I used these files.

Source: SELinux policy for nginx and GitLab unix socket in Fedora 19

Configuring NGINX to accept the PROXY Protocol – NGINX

This article explains how to configure NGINX and NGINX Plus to accept the PROXY protocol. Table of Contents Introduction Using the PROXY protocol with SSL, HTTP/2, SPDY, and WebSocket Using the PROXY protocol with a TCP Stream Complete Example Introduction The PROXY protocol enables NGINX and NGINX Plus to receive client connection information passed through […]

Source: Configuring NGINX to accept the PROXY Protocol – NGINX