May 22, 2021 Mini Program Cloud Development Advanced
Since cloud functions are closely tied to Node.js, we need some basic knowledge of Node.js and its module system. Only a few basic concepts are covered here; if you want to dig deeper, it is a good idea to go through the official Node.js technical documentation:
Technical documentation: Node.js API Chinese technical documentation
Earlier we already came into contact with the Node.js fs module and path module. These are called built-in (core) modules: they do not need to be downloaded with npm install and can be required directly:
const fs = require('fs')
const path = require('path')
const url = require('url')
Node.js' commonly used built-in modules and features can all be used directly in cloud functions.
Similar to JavaScript's global object in the browser, Node.js also has a global object, global; it and all of its properties can be accessed anywhere in the program. Below we look at the global variables Node.js uses most often in cloud functions.
__dirname is the full directory name (absolute path) of the directory containing the currently executing file. Node has another common variable, __filename, which is the file name of the currently executing file, including its full absolute path. We can create a new cloud function, such as nodefile, and then enter the following code in the cloud function's index.js:
const cloud = require('wx-server-sdk')
cloud.init({
env: cloud.DYNAMIC_CURRENT_ENV
})
exports.main = async (event, context) => {
  console.log('filename of the current file', __filename );
  console.log('directory of the current file', __dirname );
}
After uploading and deploying the cloud function, you can execute it via a mini program call, local debugging, or cloud testing, and get the following printed results (remember where the cloud function's logs can be viewed?):
filename of the current file /var/user/index.js
directory of the current file /var/user
This shows that, in the cloud, the cloud function is placed in the /var/user directory.
There are also variables such as module, module.exports, and exports that are actually local to each module; the objects they point to differ from module to module, but because they exist in every module, they can also be treated as if they were global variables.
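One pitfall worth sketching: module.exports and exports initially point at the same object, but only module.exports is what require() actually returns. The sketch below simulates a module with a plain object instead of a real second file:

```javascript
// Simulate a module object; inside a real module, `exports` starts out as
// an alias for `module.exports`.
const fakeModule = { exports: {} }
let exportsAlias = fakeModule.exports

exportsAlias.a = 1            // OK: mutates the shared object
fakeModule.exports = { b: 2 } // reassignment replaces the exported object

// Only module.exports counts; the `a` property added via the alias is lost.
console.log(fakeModule.exports) // { b: 2 }
```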
A path beginning with / is an absolute path. In a cloud function, require('/var/user/config/config.js') loads config.js from the config folder in the cloud function's directory; since the cloud function itself lives in /var/user, this is equivalent to the relative path require('./config/config.js'). When the string passed to require does not begin with '/', './', or '../', the module must be either a core module or one located in a node_modules directory.
Create a new config folder under the nodefile cloud function's directory, then create a config.js file inside it, so that the directory structure of the cloud function looks like this:
nodefile              // cloud function directory
├── config            // config folder
│   └── config.js     // config.js file
├── index.js
├── config.json
└── package.json
Then enter the following code in config.js; a module like this is typically used to store sensitive information or more generic, reusable values:
module.exports = {
  AppID: 'wxda99ae45313257046', // can be any other value, just for reference
AppKey: 'josgjwoijgowjgjsogjo',
}
Then enter the following code in the nodefile cloud function's index.js (this is not the complete file; note only the added lines):
// put the following two lines before the exports.main function
const config = require('./config/config.js')
const {AppID,AppKey} = config
// some code omitted
exports.main = async (event, context) => {
console.log({AppID,AppKey})
}
Once all of the cloud function's files have been deployed and uploaded to the cloud, executing the cloud function shows that the variables in config/config.js are passed into index.js. This also shows that under the cloud function directory you can create not only files (earlier we created images), but also modules, which are created with module.exports and loaded with require.
The process object provides information about, and control over, the current Node.js process. One of its more important properties, process.env, returns an object containing the user environment.
Take the nodefile cloud function above as an example: open the cloud development console, find nodefile in the list of cloud functions, click Configure, and add some environment variables in the popup window, such as NODE_ENV, ENV_ID, and NAME (since these are constants, upper-case names are recommended); their values are strings. Then change the nodefile cloud function's index.js to the following:
const cloud = require('wx-server-sdk')
cloud.init({
env: cloud.DYNAMIC_CURRENT_ENV
})
exports.main = async (event, context) => {
  return process.env // process is a global and can be used directly, no require needed
}
After right-clicking the cloud function and choosing incremental upload, call the cloud function; in its return object you can see, in addition to the variables we set, some information about the cloud function environment. So we can put variables that need manual modification, or that are relatively private, into the configuration and then read them inside the cloud function. For example, if we want to switch the mini program's cloud development environment after the mini program goes live, we can add an ENV_ID field whose value is modified as needed:
const cloud = require('wx-server-sdk')
const {ENV_ID} = process.env
cloud.init({
env: ENV_ID
})
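A small, hedged refinement of this idea: resolveEnv below is a hypothetical helper (not part of wx-server-sdk) that falls back to a default when ENV_ID has not yet been configured in the console, so cloud.init always receives a usable value:

```javascript
// Hypothetical helper: read ENV_ID from an environment object, with a
// fallback for when the variable has not been configured yet.
function resolveEnv(envVars, fallback) {
  return envVars.ENV_ID || fallback
}

// In a real cloud function you might write (sketch, not verified API usage):
// cloud.init({ env: resolveEnv(process.env, cloud.DYNAMIC_CURRENT_ENV) })

console.log(resolveEnv({ ENV_ID: 'prod-id' }, 'dev-id')) // prod-id
console.log(resolveEnv({}, 'dev-id'))                    // dev-id
```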
Let's revisit wx-server-sdk, a third-party module that is also the core dependency of cloud development; many APIs are built on top of it. After installing wx-server-sdk for a cloud function (right-click the cloud function and run npm install in the terminal), open the cloud function's node_modules folder on your computer: although only wx-server-sdk was installed, many modules have been downloaded, all pulled in through its three core dependencies, @cloudbase/node-sdk (formerly tcb-admin-node), protobuf, and jstslib.
To understand wx-server-sdk in depth, we can look at its core, @cloudbase/node-sdk (formerly tcb-admin-node); refer to @cloudbase/node-sdk's GitHub page. And because wx-server-sdk already downloads many dependencies, such as @cloudbase/node-sdk, xml2js, and request, these dependencies can be required directly in a cloud function:
const request = require('request')
Although the request module is a third-party module, it has already been downloaded along with wx-server-sdk, so it can be required directly in the cloud function.
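Since request uses callbacks, it is convenient to wrap it in a Promise so it can be awaited inside a cloud function's async main. promisifyGet below is our own sketch, not part of the request module:

```javascript
// Wrap a callback-style request function so it returns a Promise.
function promisifyGet(requestFn) {
  return (url) => new Promise((resolve, reject) => {
    requestFn(url, (err, response, body) => {
      if (err) reject(err)
      else resolve(body)
    })
  })
}

// Usage in a cloud function (request comes along with wx-server-sdk):
// const request = require('request')
// const get = promisifyGet(request)
// exports.main = async () => ({ html: await get('https://example.com') })
```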
Since wx-server-sdk is downloaded and installed for every cloud function, we can treat it as a de facto built-in module of cloud functions, and the many dependencies that wx-server-sdk pulls in can likewise be used without downloading them again with npm install. After npm install completes, you can view the version information of those dependencies in package-lock.json.
The Node.js ecosystem has the most third-party modules of any programming language, more than Python, PHP, or Java. These open-source modules can greatly reduce our development cost, and all of them can be searched on the official npm site. But npm is large and all-encompassing; which modules do Node.js developers actually use most, and which are the best? We can look at awesome-nodejs on GitHub, which has very comprehensive recommendations.
In awesome-nodejs, these excellent modules are divided into nearly 50 categories, most of which can be used in cloud functions, so the power of cloud functions goes well beyond what the cloud development documentation covers. In this chapter we will pick some of the more representative modules and explain them in combination with cloud functions.
When we want to use a third-party module in a cloud function, we first need to add it to the dependencies field of the cloud function's package.json, in the form "module-name": "version". The version number can be expressed in many ways, and npm install will download the corresponding version (only some of the more common forms are listed):
latest: the newest published version of the module will be downloaded;
1.2.x: equivalent to 1.2, a version >=1.2.0 and <1.3.0 will be downloaded;
~1.2.4: a version >=1.2.4 and <1.3.0 will be downloaded;
^1.2.4: a version >=1.2.4 and <2.0.0 will be downloaded.
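Putting the forms above together, a dependencies field might look like this (the module names are placeholders, not real packages):

```json
{
  "dependencies": {
    "module-latest": "latest",
    "module-minor": "1.2.x",
    "module-patch": "~1.2.4",
    "module-caret": "^1.2.4"
  }
}
```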
For example, if we want to use the latest version of lodash in the cloud function, we can add "lodash": "latest" to the cloud function's package.json. Note that it is added inside the dependencies property, and package.json must meet the format requirements of a JSON configuration file: in particular, the last item must not be followed by a comma, and you cannot write comments in a JSON file:
"dependencies": {
"lodash": "latest"
}
A package-lock.json file is generated when npm install runs; it records the exact source and version number of every npm package actually installed in the current state.
Different versions can produce different results at run time, so to keep the installed versions consistent with package-lock.json, it is usually safer to pin a specific version number rather than always using latest.
Cloud functions run in a server-side Linux environment. When handling concurrent requests, a cloud function creates multiple instances; these instances are isolated from each other, with no shared memory or disk space. Cloud functions are likewise isolated from one another, so each cloud function must download its own dependencies; they cannot be reused across functions.
The creation, management, and destruction of cloud function instances are handled automatically by the platform. Each cloud function instance provides 512 MB of temporary disk space under the /tmp directory (this is the server-side absolute path /tmp, not a ./tmp inside the cloud function directory) to handle temporary file reads and writes during a single execution. If you want persistent storage, it is best to use cloud storage.
Cloud functions should be stateless: an execution must not depend on information left in the runtime environment by a previous execution. To ensure load balancing, the platform controls the number of cloud function instances based on current load, and in some cases reuses instances; this means two consecutive calls handled by the same instance share the same temporary disk space. But because instances can be destroyed at any time, and consecutive requests do not necessarily land on the same instance (multiple instances may be created at once), cloud functions should not rely on data left in temporary disk space by earlier calls. The general principle is that cloud function code should be stateless.
A few other points to note:
the behavior of a cloud function's return value differs somewhat from a normal local Node.js run, so take note;
directories other than /tmp are not writable by the cloud function;
asynchronous tasks that are not awaited may be cut off when the cloud function terminates.