[Front-End Engineering] II: Using, Encapsulating and Publishing the Automated Build Tools Grunt and Gulp

Automated builds

When developing a web application, there are often commands that need to be run repeatedly during development

npm scripts are the simplest way to automate a build workflow

  • "build": "sass scss/main.scss css/style.css --watch",

The pre-hook mechanism of npm scripts: a "preserve" script runs yarn build before yarn serve is executed;

  • "preserve": "yarn build",

Using the npm-run-all module, multiple script tasks can be run at the same time // npm install npm-run-all --save-dev

  • "start": "run-p build serve"

Using the browser-sync module, you can start a local development server

  • "serve": "browser-sync",
// package.json file
{
    "name": "my-web-app",
    "version": "0.1.0",
    "main": "index.js",
    "author": "xiapeng",
    "license": "MIT",
    "scripts": {
        "build": "sass scss/main.scss css/style.css --watch",
        // "preserve": "yarn build,
        "serve": "browser-sync",
        "start": "run-p build serve"
    },
    "devDependencies": {
        "browser-sync": "^2.26.7",
        "npm-run-all": "^4.1.5",
        "sass": "^1.22.10"
    }
}

Common automated build tools: Grunt, Gulp, FIS

Grunt is the earliest front-end build system. Because its workflow is based on temporary files, builds are relatively slow: every step involves reading from and writing to disk, and in large projects with many files this adds up. Gulp is memory-based: files are processed in memory, multiple files can be processed concurrently, and its ecosystem is fairly mature.

Grunt

// Initialize the package.json file
yarn init --yes
// Install grunt
yarn add grunt
// Run a grunt task
yarn grunt <task-name>

Create Task

Create a single task: grunt.registerTask

  • grunt.registerTask('default', ['foo', 'bar']) // create a default task

  • grunt.task.run('foo', 'bar') // foo and bar will be executed automatically after the current task completes

  • const done = this.async() // done() marks the end of an asynchronous task; an arrow function cannot be used because `this` is needed

  • // Grunt's entry file
    // It is used to define some tasks that need to be executed automatically by Grunt
    // You need to export a function
    // This function receives a parameter of the object type of a grunt
    // The grunt object provides some APIs that will be used when creating tasks
    
    module.exports = grunt => {
      grunt.registerTask('foo', 'a sample task', () => {
        console.log('hello grunt')
      })
    
      grunt.registerTask('bar', () => {
        console.log('other task')
      })
    
      // //Default is the default task name
      // //It can be omitted when executed through grunt
      // grunt.registerTask('default', () => {
      //   console.log('default task')
      // })
    
      // The second argument can specify the tasks this task maps to,
      // so executing default is equivalent to executing those tasks
      // The mapped tasks run one after another in sequence, not concurrently
      grunt.registerTask('default', ['foo', 'bar'])
    
      // You can also perform other tasks in the task function
      grunt.registerTask('run-other', () => {
        // foo and bar will be executed automatically after the current task is completed
        grunt.task.run('foo', 'bar')
        console.log('current task running~')
      })
    
      // By default, grunt tasks are written in synchronous style
      // For asynchronous work, use the this.async() method to get a callback
      // grunt.registerTask('async-task', () => {
      //   setTimeout(() => {
      //     console.log('async task working~')
      //   }, 1000)
      // })
    
      // Since `this` is needed in the function body, an arrow function cannot be used here
      grunt.registerTask('async-task', function () {
        const done = this.async()
        setTimeout(() => {
          console.log('async task working~')
          done()
        }, 1000)
      })
    }
    

Create a multi-target task: grunt.registerMultiTask

  • Targets need to be configured when setting up a multi-target task

  • grunt.initConfig({ ** })

  • The options of a target override the options of the task; they can be read with this.options();

  • Use this.target to get the current target's name and this.data to get its data;

  • module.exports = grunt => {
      // In multi-target mode, the task forms multiple targets (subtasks) according to the configuration
      grunt.initConfig({
        build: {
          options: {
            msg: 'task options'
          },
          foo: {
            options: {
              msg: 'foo target options'
            }
          },
          bar: '456'
        }
      })
    
      grunt.registerMultiTask('build', function () {
        console.log(this.options())
        console.log(this.target, this.data)
      })
    }
    
    

Grunt plugin

  1. Find the plugin and install it;
  2. Load the plugin with grunt.loadNpmTasks;
  3. Add configuration options for the task in grunt.initConfig;
module.exports = grunt => {
  grunt.initConfig({
    clean: {
      temp: 'temp/**'
    }
  })
  
  grunt.loadNpmTasks('grunt-contrib-clean')
}
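
As with the other plugins below, the plugin would presumably be installed as a dev dependency first and then run through grunt (grunt-contrib-clean registers the clean task itself):

yarn add grunt-contrib-clean --dev

yarn grunt clean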

Common plug-ins

grunt-sass

yarn add grunt-sass sass --dev

yarn grunt sass

const sass = require('sass')

module.exports = grunt => {
    grunt.initConfig({
        sass: {
            options: {
                sourceMap: true,
                implementation: sass
            },
            main: {
                files: {
                    'dist/css/main.css': 'src/scss/main.scss'
                }
            }
        }
    })

    grunt.loadNpmTasks('grunt-sass')
}

grunt-babel

yarn add grunt-babel @babel/core @babel/preset-env --dev

yarn grunt babel

  • As more and more task plugins are added, the load-grunt-tasks plugin can load them all automatically, so you don't have to call loadNpmTasks for each one;
  • yarn add load-grunt-tasks --dev
  • A preset is the set of features Babel should transform; @babel/preset-env is chosen here, i.e. transform according to the latest ES features;
const sass = require('sass')
const loadGruntTasks = require('load-grunt-tasks')

module.exports = grunt => {
    grunt.initConfig({
        sass: {
            options: {
                sourceMap: true,
                implementation: sass
            },
            main: {
                files: {
                    'dist/css/main.css': 'src/scss/main.scss'
                }
            }
        },
        babel: {
            options: {
                sourceMap: true,
                presets: ['@babel/preset-env']
            },
            main: {
                files: {
                    'dist/js/app.js': 'src/js/app.js'
                }
            }
        },
        watch: {
            js: {
                files: ['src/js/*.js'],
                tasks: ['babel']
            },
            css: {
                files: ['src/scss/*.scss'],
                tasks: ['sass']
            }
        }
    })

    loadGruntTasks(grunt) // Automatically load tasks from all installed grunt plugins

    grunt.registerTask('default', ['sass', 'babel', 'watch'])
}

grunt-contrib-watch

yarn add grunt-contrib-watch --dev

yarn grunt watch

  • Map all tasks to the default task;
  • grunt.registerTask('default', ['sass', 'babel', 'watch'])
watch: {
    js: {
        files: ['src/js/*.js'], // Files to watch
        tasks: ['babel'] // Tasks to run when the watched files change
    },
    css: {
         files: ['src/scss/*.scss'],
         tasks: ['sass']
    }
}

grunt.registerTask('default', ['sass', 'babel', 'watch'])

Gulp

yarn add gulp --dev

Create a gulpfile.js file to configure gulp

yarn gulp foo

  • The latest version of gulp treats all tasks as asynchronous by default
  • A task receives a done parameter, a callback that marks the end of the task
// //All exported functions are treated as gulp tasks
// exports.foo = () => {
//   console.log('foo task working~')
// }

// gulp's task functions are asynchronous
// The completion of the task can be identified by calling the callback function
exports.foo = done => {
  console.log('foo task working~')
  done() // Identifies the completion of task execution
}

// Default is the default task
// The task name parameter can be omitted at run time
exports.default = done => {
  console.log('default task working~')
  done()
}

// Before v4.0, tasks had to be registered through the gulp.task() method; it still works in v4
const gulp = require('gulp')

gulp.task('bar', done => {
  console.log('bar task working~')
  done()
})

Gulp composite task

series creates a serial composition: the tasks are executed one after another in order;

parallel creates a parallel composition: the tasks are executed concurrently;

const { series, parallel } = require('gulp')

const task1 = done => {
  setTimeout(() => {
    console.log('task1 working~')
    done()
  }, 1000)
}

const task2 = done => {
  setTimeout(() => {
    console.log('task2 working~')
    done()
  }, 1000)  
}

const task3 = done => {
  setTimeout(() => {
    console.log('task3 working~')
    done()
  }, 1000)
}

// Let multiple tasks execute in order
exports.foo = series(task1, task2, task3)

// Let multiple tasks execute at the same time
exports.bar = parallel(task1, task2, task3)

Asynchronous tasks for Gulp

There are three main ways:

  1. Accept a done parameter and call it as a callback when the task finishes; to mark a failure, pass an error to the callback: done(new Error('task failed'));

  2. Return a Promise object: return Promise.resolve(); to signal an error, return a rejected Promise: return Promise.reject(new Error('task failed'));

  3. Use async and await, which implicitly return a Promise;

const fs = require('fs')

exports.callback = done => {
  console.log('callback task')
  done()
}

exports.callback_error = done => {
  console.log('callback task')
  done(new Error('task failed'))
}

exports.promise = () => {
  console.log('promise task')
  return Promise.resolve()
}

exports.promise_error = () => {
  console.log('promise task')
  return Promise.reject(new Error('task failed'))
}

const timeout = time => {
  return new Promise(resolve => {
    setTimeout(resolve, time)
  })
}

exports.async = async () => {
  await timeout(1000)
  console.log('async task')
}

Besides the three methods above, the following approach also works:

File stream mode

  • A file stream can be returned because gulp registers an end event on the stream, which is how it knows the task has finished; you can also simulate this manually by registering the end event yourself and calling done();
exports.stream = () => {
  const read = fs.createReadStream('yarn.lock')
  const write = fs.createWriteStream('a.txt')
  read.pipe(write)
  return read
}

// exports.stream = done => {
//   const read = fs.createReadStream('yarn.lock')
//   const write = fs.createWriteStream('a.txt')
//   read.pipe(write)
//   read.on('end', () => {
//     done()
//   })
// }

Core working principle of the Gulp build process

Simulating the build process with the file stream API

Most of a build process boils down to reading files, performing some transformation, and writing the result to another location;

const fs = require('fs')
const { Transform } = require('stream')

exports.default = () => {
  // File read stream
  const readStream = fs.createReadStream('normalize.css')

  // File write stream
  const writeStream = fs.createWriteStream('normalize.min.css')

  // File conversion stream
  const transformStream = new Transform({
    // Core conversion process
    transform: (chunk, encoding, callback) => {
      const input = chunk.toString()
      const output = input.replace(/\s+/g, '').replace(/\/\*.+?\*\//g, '')
      callback(null, output)
    }
  })

  return readStream
    .pipe(transformStream) // transformation
    .pipe(writeStream) // write in
}

Gulp file manipulation API

Read and write files

  • src creates a file read stream and dest creates a file write stream: const { src, dest } = require('gulp')
const { src, dest } = require('gulp')
const cleanCSS = require('gulp-clean-css')
const rename = require('gulp-rename')

exports.default = () => {
  return src('src/*.css')
    .pipe(cleanCSS())
    .pipe(rename({ extname: '.min.css' }))
    .pipe(dest('dist'))
}
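
The gulp-clean-css and gulp-rename plugins used above are not part of gulp itself; presumably they would be installed as dev dependencies first:

yarn add gulp-clean-css gulp-rename --dev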

Gulp case study

Implementing an automated build workflow for a web application according to the project's requirements

Style compilation

yarn add sass gulp-sass --dev

  • The base option of src sets the base directory, so files are written out preserving their original paths relative to it;
  • src('src/assets/styles/*.scss', { base: 'src' })
  • Style files whose names start with an underscore (_) are treated as partials and are not compiled by default;
  • Passing { outputStyle: 'expanded' } to sass outputs fully expanded CSS;
// gulpfile.js

const { src, dest } = require('gulp') 
const sass = require('gulp-sass')(require('sass'))

const style = () => {
  return src('src/assets/styles/*.scss', { base: 'src' })
    .pipe(sass({ outputStyle: 'expanded' }))
    .pipe(dest('dist'))
}

module.exports = {
  style
}

Script Compilation

yarn add gulp-babel @babel/core @babel/preset-env --dev

  • gulp-babel needs presets configured and requires the core package @babel/core to be installed
const { src, dest } = require('gulp') 
const babel = require('gulp-babel')

const script = () => {
  return src('src/assets/scripts/*.js', { base: 'src' })
    .pipe(babel({ presets: ['@babel/preset-env'] }))
    .pipe(dest('dist'))
}

module.exports = {
  script
}

Page template compilation

yarn add gulp-swig --dev

  • swig({ data }): the template engine receives the data to render;
  • swig caches HTML by default; disable the cache to support hot updates: defaults: { cache: false }
const { src, dest } = require('gulp') 
const swig = require('gulp-swig')

const data = {...}

const page = () => {
  return src('src/**/*.html')
    .pipe(swig({ data, defaults: { cache: false } }))
    .pipe(dest('dist'))
}

module.exports = {
  page
}

Merge the above three compilation tasks into a parallel task

const { src, dest, parallel } = require('gulp') 
const compile = parallel(style,script,page)

module.exports = {
  compile
}

Image and font file conversion

const imagemin = require('gulp-imagemin')

// gulp-imagemin can be difficult to install in China

  • Non-image files in the font directory are not processed by imagemin and are simply copied through
const image = () => {
  return src('src/assets/images/**', { base: 'src' })
    .pipe(imagemin())
    .pipe(dest('dist'))
}

const font = () => {
  return src('src/assets/fonts/**', { base: 'src' })
    .pipe(imagemin())
    .pipe(dest('dist'))
}

Other files and cleaning the output

yarn add del --dev

const del = require('del')

const extra = () => {
  return src('public/**', { base: 'public' })
    .pipe(dest('dist'))
}

const clean = () => {
  return del(['dist'])
}

const build = series(clean, parallel(compile, extra))

module.exports = {
  build
}

Automatically loading plugins

yarn add gulp-load-plugins --dev

  • All gulp plugins become camelCase properties of the plugins object (the gulp- prefix is dropped), and are used as plugins.xxx
const loadPlugins = require('gulp-load-plugins')
const plugins = loadPlugins()
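
As a small sketch of what this looks like in practice (assuming gulp-clean-css is installed; the minify task name is just for illustration), gulp-clean-css is accessed as plugins.cleanCss instead of being required directly:

const { src, dest } = require('gulp')
const loadPlugins = require('gulp-load-plugins')
const plugins = loadPlugins()

// gulp-clean-css: the "gulp-" prefix is dropped and the rest of the name is camelCased
const minify = () => {
  return src('src/assets/styles/*.css')
    .pipe(plugins.cleanCss())
    .pipe(dest('dist'))
}

module.exports = { minify }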

Development server

yarn add browser-sync --dev

  • server configures the directories served by the development server
  • notify controls the browser-sync notification shown in the top-right corner of the page
  • files tells browser-sync which files to watch; the browser refreshes automatically when they change
const browserSync = require('browser-sync')
const bs = browserSync.create()

const serve = () => {
  bs.init({
    notify: false,
    files: 'dist/**',
    server: {
      baseDir: 'dist',
      routes: {
        '/node_modules': 'node_modules'
      }
    }
  })
}

module.exports = {
  serve
}

Watching for changes and build optimization

gulp provides a watch API that listens for file changes and runs the corresponding task; the first parameter is the path (glob) to watch and the second is the task to run;

  • swig needs to be configured not to cache

  • Instead of configuring files: 'dist/**' to reload on changes, you can append .pipe(bs.reload({ stream: true })) to each build task that should trigger a reload

  • In the development environment, images and fonts are only compressed, so the page looks the same before and after the build. To cut down on builds, these files can be read directly from the source directories by configuring baseDir: ['dist', 'src', 'public']: when a file is not found in dist, browser-sync falls back to src and then public. Then, when static files such as images and fonts change, there is no need to rebuild; just reload bs, as in the watch call below;

    watch([
        'src/assets/images/**',
        'src/assets/fonts/**',
        'public/**'
    ], bs.reload)
    
  • Create a new develop task that builds the project first and then starts the development server

The optimized gulpfile.js:

const { src, dest, parallel, series, watch } = require('gulp') 

const page = () => {
  return src('src/**/*.html')
    .pipe(plugins.swig({ data, defaults: { cache: false } }))
    .pipe(dest('dist'))
    .pipe(bs.reload({ stream: true }))
}

const serve = () => {
  // Watch the following three kinds of source files and re-run the corresponding build task on changes
  // When dist changes, bs reloads because the files option is configured
  // Alternatively, reload manually by appending .pipe(bs.reload({ stream: true })) to each task
  watch('src/assets/styles/*.scss', style)
  watch('src/assets/scripts/*.js', script)
  watch('src/**/*.html', page)
  // In development, images and fonts are only compressed and look the same before and after the build, so skip rebuilding them and just reload bs
  watch([
    'src/assets/images/**',
    'src/assets/fonts/**',
    'public/**'
  ], bs.reload)

  bs.init({
    files: 'dist/**',
    notify: false,
    server: {
      baseDir: ['dist', 'src', 'public'],
      routes: {
        '/node_modules': 'node_modules'
      }
    }
  })
}

const compile = parallel(style,script,page, image, font)

const build = series(clean, parallel(compile, extra))

const develop = series(compile, serve)

module.exports = {
  build,
  serve,
  develop
}

useref file reference processing

yarn add gulp-useref --dev

  • gulp-useref solves the problem of file references (such as node_modules paths) in the built pages: it concatenates the referenced files into new bundles

  • Build comments in the HTML mark the groups of files to merge

    <!-- build:css assets/styles/vendor.css -->
    <link rel="stylesheet" href="/node_modules/bootstrap/dist/css/bootstrap.css">
    <!-- endbuild -->
    <!-- build:css assets/styles/main.css -->
    <link rel="stylesheet" href="assets/styles/main.css">
    <!-- endbuild -->
    
    <!-- build:js assets/scripts/vendor.js -->
    <script src="/node_modules/jquery/dist/jquery.js"></script>
    <script src="/node_modules/popper.js/dist/umd/popper.js"></script>
    <script src="/node_modules/bootstrap/dist/js/bootstrap.js"></script>
    <!-- endbuild -->
    
    <!-- Converted result -->
    <link rel="stylesheet" href="assets/styles/vendor.css">
    <link rel="stylesheet" href="assets/styles/main.css">
    
    <script src="assets/scripts/vendor.js"></script>
    
  • searchPath specifies where to look for the files referenced in dist/*.html: first in the dist directory and then, if not found there, in the project root ('.');

const useref = () => {
  return src('dist/*.html', { base: 'dist' })
    .pipe(plugins.useref({ searchPath: ['dist', '.'] }))
    .pipe(dest('dist'))
}

File compression

While the useref plugin is running, additional operations can be performed in the same pipeline, such as compressing the HTML, CSS and JS files;

yarn add gulp-htmlmin gulp-uglify gulp-clean-css --dev

These are plugins for compressing HTML, JS and CSS respectively;

  • At this point the files in the pipeline come in three formats: HTML, CSS and JS, and each format needs its own plugin;
  • yarn add gulp-if --dev
  • Reading from dist and writing to dist at the same time causes problems, so a new release directory is used instead; however, writing to release breaks the structure of the build output;
const useref = () => {
  return src('dist/*.html', { base: 'dist' })
    .pipe(plugins.useref({ searchPath: ['dist', '.'] }))
    .pipe(plugins.if(/\.js$/, plugins.uglify()))
    .pipe(plugins.if(/\.css$/, plugins.cleanCss()))
    .pipe(plugins.if(/\.html$/, plugins.htmlmin({ collapseWhitespace: true, minifyCSS: true, minifyJS: true })))
    // .pipe(dest('dist'))
    .pipe(dest('release'))
}

Re-planning the build process

Because useref breaks the structure of the build output, the build process needs to be re-planned;

  • The final packaged output goes into the dist directory

  • During development, the local server serves the temp directory, and the style, script and page tasks write to temp;

  • The useref compression and reference-merging task reads from the temp directory and writes to dist;

  • Task execution is integrated into npm scripts;

  • "scripts": {
        "clean": "gulp clean",
        "build": "gulp build",
        "dev": "gulp develop"
    },
    

The final re-planned gulpfile.js:

const { src, dest, parallel, series, watch } = require('gulp') 

const del = require('del')
const browserSync = require('browser-sync')

const loadPlugins = require('gulp-load-plugins')
const plugins = loadPlugins()
const bs = browserSync.create()

const sass = require('gulp-sass')(require('sass'))
// const babel = require('gulp-babel')
// const swig = require('gulp-swig')
// const imagemin = require('gulp-imagemin')

// Data passed in by template engine
const data = {...}

const clean = () => {
  return del(['dist', 'temp'])
}


const style = () => {
  return src('src/assets/styles/*.scss', { base: 'src' })
    .pipe(sass({ outputStyle: 'expanded' }))
    .pipe(dest('temp'))
    .pipe(bs.reload({ stream: true }))
}

const script = () => {
  return src('src/assets/scripts/*.js', { base: 'src' })
    .pipe(plugins.babel({ presets: ['@babel/preset-env'] }))
    .pipe(dest('temp'))
    .pipe(bs.reload({ stream: true }))
}

const page = () => {
  return src('src/**/*.html')
    .pipe(plugins.swig({ data, defaults: { cache: false } }))
    .pipe(dest('temp'))
    .pipe(bs.reload({ stream: true }))
}

const image = () => {
  return src('src/assets/images/**', { base: 'src' })
    // .pipe(imagemin())
    .pipe(dest('dist'))
}

const font = () => {
  return src('src/assets/fonts/**', { base: 'src' })
    // .pipe(imagemin())
    .pipe(dest('dist'))
}

const extra = () => {
  return src('public/**', { base: 'public' })
    .pipe(dest('dist'))
}

const serve = () => {
  // Watch the following three kinds of source files and re-run the corresponding build task on changes
  // Reloading is handled by the .pipe(bs.reload({ stream: true })) appended to each of those tasks,
  // so the files option of bs.init is no longer needed
  watch('src/assets/styles/*.scss', style)
  watch('src/assets/scripts/*.js', script)
  watch('src/**/*.html', page)
  // In development, images and fonts are only compressed and look the same before and after the build, so skip rebuilding them and just reload bs
  watch([
    'src/assets/images/**',
    'src/assets/fonts/**',
    'public/**'
  ], bs.reload)

  bs.init({
    // files: 'dist/**',
    notify: false,
    server: {
      baseDir: ['temp', 'src', 'public'],
      routes: {
        '/node_modules': 'node_modules'
      }
    }
  })
}

// The compile task must run before useref, because useref processes the build comments in the compiled HTML under temp
// Reading from and writing to the same directory at the same time is not recommended, which is why temp is used
const useref = () => {
  return src('temp/*.html', { base: 'temp' })
    .pipe(plugins.useref({ searchPath: ['temp', '.'] }))
    .pipe(plugins.if(/\.js$/, plugins.uglify()))
    .pipe(plugins.if(/\.css$/, plugins.cleanCss()))
    .pipe(plugins.if(/\.html$/, plugins.htmlmin({ collapseWhitespace: true, minifyCSS: true, minifyJS: true })))
    .pipe(dest('dist'))
}

const compile = parallel(style, script, page)

// Tasks performed before going online
const build = series(clean, parallel(series(compile, useref), image, font, extra))

const develop = series(compile, serve)

module.exports = {
  clean,
  build,
  develop,
}

Encapsulating the workflow

A gulpfile.js written once for a single project is hard to reuse: copying it into each project and modifying it there is inconvenient, so instead it is encapsulated, together with the engineering tooling, into a reusable workflow package;

  • The main field in package.json specifies the entry file. It must be configured when developing a package that other projects depend on, otherwise the package cannot be imported;
  • Self-developed packages usually put the entry file at lib/index.js;

Extract gulpfile

  • Create a new folder for the workflow package

  • Initialize its package.json, move the project's original gulp dev dependencies into the workflow's production dependencies, and configure the entry file

    "dependencies": {
        "@babel/core": "^7.16.7",
        "@babel/preset-env": "^7.16.7",
        "gulp-babel": "^8.0.0",
        "gulp-imagemin": "^8.0.0",
        "gulp-swig": "^0.9.1",
        "sass": "^1.47.0",
        "browser-sync": "^2.27.7",
        "del": "^6.0.0",
        "gulp": "^4.0.2",
        "gulp-clean-css": "^4.3.0",
        "gulp-htmlmin": "^5.0.1",
        "gulp-if": "^3.0.0",
        "gulp-load-plugins": "^2.0.7",
        "gulp-sass": "^5.1.0",
        "gulp-uglify": "^3.0.2",
        "gulp-useref": "^5.0.0"
      },
    "main": "lib/index.js",
    
  • Copy the contents of the original gulpfile.js into the entry file of the workflow package, and create a gulpfile.js in the new project that simply re-exports it

    // gulpfile.js in the new project
    module.exports = require('xp-web')
    
  • Configure the scripts in the new project

    "scripts": {
        "clean": "gulp clean",
        "build": "gulp build",
        "dev": "gulp develop"
    },
    

Solving problems in the module

After extracting gulpfile.js, running yarn build fails because the template engine's data cannot be found. The data should be provided by the user rather than hard-coded in the workflow, so create a configuration file in the new project and read it from the workflow;

  • Create a new configuration file pages.config.js in the new project

    module.exports = {
        data: {
            menus: [
              {
                name: 'Home',
                icon: 'aperture',
                link: 'index.html'
              },
              {
                name: 'Features',
                link: 'features.html'
              },
              {
                name: 'About',
                link: 'about.html'
              },
              {
                name: 'Contact',
                link: '#',
                children: [
                  {
                    name: 'Twitter',
                    link: 'https://twitter.com/w_zce'
                  },
                  {
                    name: 'About',
                    link: 'https://weibo.com/zceme'
                  },
                  {
                    name: 'divider'
                  },
                  {
                    name: 'About',
                    link: 'https://github.com/zce'
                  }
                ]
              }
            ],
            pkg: require('./package.json'),
            date: new Date()
          }
    }
    
  • In the entry file, replace the hard-coded data by dynamically importing the configuration file

    // Returns the working directory where the current command line is located
    const cwd = process.cwd()
    // Default configuration
    let config = {}
    
    try {
        const loadConfig = require(`${cwd}/pages.config.js`)
        config = Object.assign({}, config, loadConfig)
    } catch (e){}
    

The '@babel/preset-env' string used in the original gulpfile.js is resolved from the working directory's node_modules, where it can no longer be found now that the gulpfile lives inside the workflow package. Change it to require('@babel/preset-env') so that Node resolves the module by walking up the directory tree, instead of looking only in the working directory

  • const script = () => {
      return src('src/assets/scripts/*.js', { base: 'src' })
        .pipe(plugins.babel({ presets: [require('@babel/preset-env')] }))
        .pipe(dest('temp'))
        .pipe(bs.reload({ stream: true }))
    }
    

Abstracting the path configuration

Expose all hard-coded paths through the configuration file and provide a default configuration

  • The cwd option of src specifies the directory from which src starts searching; the default is the current working directory
let config = {
  build: {
    src: 'src',
    dist: 'dist',
    temp: 'temp',
    public: 'public',
    paths: {
      styles: 'assets/styles/*.scss',
      scripts: 'assets/scripts/*.js',
      pages: '*.html',
      images: 'assets/images/**',
      fonts: 'assets/fonts/**'
    }
  }
}
// Abstract path configuration and configure cwd
const clean = () => {
  return del([config.build.dist, config.build.temp])
}
const style = () => {
  return src(config.build.paths.styles, { base: config.build.src, cwd: config.build.src })
    .pipe(sass({ outputStyle: 'expanded' }))
    .pipe(dest(config.build.temp))
    .pipe(bs.reload({ stream: true }))
}
const script = () => {
  return src(config.build.paths.scripts, { base: config.build.src, cwd: config.build.src })
    .pipe(plugins.babel({ presets: [require('@babel/preset-env')] }))
    .pipe(dest(config.build.temp))
    .pipe(bs.reload({ stream: true }))
}

const page = () => {
  return src(config.build.paths.pages, { cwd: config.build.src })
    .pipe(plugins.swig({ data: config.data, defaults: { cache: false } }))
    .pipe(dest(config.build.temp))
    .pipe(bs.reload({ stream: true }))
}

const image = () => {
  return src(config.build.paths.images, { base: config.build.src, cwd: config.build.src })
    // .pipe(imagemin())
    .pipe(dest(config.build.dist))
}

const font = () => {
  return src(config.build.paths.fonts, { base: config.build.src, cwd: config.build.src })
    // .pipe(imagemin())
    .pipe(dest(config.build.dist))
}

const extra = () => {
  return src('**', { base: config.build.public, cwd: config.build.public })
    .pipe(dest(config.build.dist))
}

const serve = () => {
  // Watch the following three kinds of source files and re-run the corresponding build task on changes
  // Reloading is handled by the .pipe(bs.reload({ stream: true })) appended to each of those tasks,
  // so the files option of bs.init is no longer needed
  watch(config.build.paths.styles, { cwd: config.build.src }, style)
  watch(config.build.paths.scripts, { cwd: config.build.src }, script)
  watch(config.build.paths.pages, { cwd: config.build.src }, page)
  // In development, images and fonts are only compressed and look the same before and after the build, so skip rebuilding them and just reload bs
  watch([
    config.build.paths.images,
    config.build.paths.fonts,
  ],{ cwd: config.build.src }, bs.reload)
  watch([
    '**'
  ],{ cwd: config.build.public }, bs.reload)

  bs.init({
    // files: 'dist/**',
    notify: false,
    server: {
      baseDir: [config.build.temp, config.build.src, config.build.public],
      routes: {
        '/node_modules': 'node_modules'
      }
    }
  })
}

// The compile task must run before useref, because useref processes the build comments in the compiled HTML under temp
// Reading from and writing to the same directory at the same time is not recommended, which is why temp is used
const useref = () => {
  return src(config.build.paths.pages, { base: config.build.temp, cwd: config.build.temp })
    .pipe(plugins.useref({ searchPath: [config.build.temp, '.'] }))
    .pipe(plugins.if(/\.js$/, plugins.uglify()))
    .pipe(plugins.if(/\.css$/, plugins.cleanCss()))
    .pipe(plugins.if(/\.html$/, plugins.htmlmin({ collapseWhitespace: true, minifyCSS: true, minifyJS: true })))
    .pipe(dest(config.build.dist))
}

The new project's configuration file pages.config.js

module.exports = {
  build: {
    src: 'src',
    dist: 'release',
    temp: '.tmp',
    public: 'public',
    paths: {
      styles: 'assets/styles/*.scss',
      scripts: 'assets/scripts/*.js',
      pages: '*.html',
      images: 'assets/images/**',
      fonts: 'assets/fonts/**'
    }
  },
  data: {
    menus: [
      {
        name: 'Home',
        icon: 'aperture',
        link: 'index.html'
      },
      {
        name: 'Features',
        link: 'features.html'
      },
      {
        name: 'About',
        link: 'about.html'
      },
      {
        name: 'Contact',
        link: '#',
        children: [
          {
            name: 'Twitter',
            link: 'https://twitter.com/w_zce'
          },
          {
            name: 'About',
            link: 'https://weibo.com/zceme'
          },
          {
            name: 'divider'
          },
          {
            name: 'About',
            link: 'https://github.com/zce'
          }
        ]
      }
    ],
    pkg: require('./package.json'),
    date: new Date()
  }
}

Encapsulating the Gulp CLI

In the new project, the only job of gulpfile.js is to import our own workflow

module.exports = require('xp-web')

If gulpfile.js is removed from the new project and the gulpfile is encapsulated inside our own workflow instead, gulp reports an error at this point because it cannot find a gulpfile

  • The gulp command provides the --gulpfile option to specify the path to the gulpfile, and the --cwd option to specify the working directory;
  • gulp build --gulpfile ./node_modules/xp-web/lib/index.js --cwd .

However, this makes the command too long to type every time, so consider wrapping it in your own CLI

  • To provide a CLI for the workflow, create an xp-web.js file under the bin directory, and configure the bin field in package.json to point to it

  • CLI files must start with the shebang line #!/usr/bin/env node

  • process.argv holds the command-line arguments, e.g. the --gulpfile and --cwd arguments in gulp build --gulpfile ./node_modules/xp-web/lib/index.js --cwd .

  • "bin": "bin/xp-web",
        
    // xp-web.js
    #!/usr/bin/env node
    
    // Append --cwd <current project directory> to the arguments gulp will parse
    process.argv.push('--cwd')
    process.argv.push(process.cwd())
    // Append --gulpfile <workflow entry>; require.resolve('..') resolves to this package's main file (lib/index.js)
    process.argv.push('--gulpfile')
    process.argv.push(require.resolve('..'))
    
    // Hand control over to the gulp CLI
    require('gulp/bin/gulp')
    
  • At this point, run yarn link again to register the CLI globally
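
A rough sketch of the linking steps (assuming the workflow package is named xp-web, as above, and that yarn's global bin directory is on the PATH):

// In the xp-web workflow package directory: register the package and its bin globally
yarn link

// In the new project directory: link the workflow package into node_modules
yarn link xp-web

// The CLI is now available
xp-web build
xp-web develop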

Publishing and using the module

By default, npm publish publishes the package root together with the directories listed in the files field of package.json
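
A sketch of what that field might look like for this workflow package, assuming the lib and bin directories used above are what should be published:

"files": [
  "lib",
  "bin"
]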

After publishing to npm, there may be a delay before the package is available from the Taobao mirror; you can go to https://npmmirror.com/, find the package, and click SYNC to synchronize it

I have published my own xp-web package for reference; you can install it with npm i xp-web;
