
Working with the file system in cloud development


May 22, 2021 Mini Program Cloud Development Advanced



First, reading files on the server side of the cloud function

Through Node.js modules, cloud functions can interact with the server-side file system. For example, to use a server-side image in the cloud, we first read it with fs.createReadStream and then upload it to cloud storage. Node.js's file-handling capabilities let cloud functions manipulate server-side files: file lookups, reads, writes, and even code compilation.

Take the nodefile cloud function as an example. Using the WeChat developer tools, create a new folder under the nodefile cloud function directory, such as assets, and put a demo.jpg image file and an index.html web page file into assets. The directory structure is as follows:

nodefile                 // cloud function directory
├── config               // config folder
│   └── config.js        // config.js file
├── assets               // assets folder
│   ├── demo.jpg
│   └── index.html
├── index.js
├── config.json
└── package.json

Then enter the following code in the index.js of the nodefile cloud function, reading the file in the cloud function directory with fs.createReadStream:

const cloud = require('wx-server-sdk')
const fs = require('fs')
const path = require('path')

cloud.init({
  env: cloud.DYNAMIC_CURRENT_ENV
})

exports.main = async (event, context) => {
  const fileStream = fs.createReadStream(path.join(__dirname, './assets/demo.jpg'))
  return await cloud.uploadFile({
    cloudPath: 'demo.jpg',
    fileContent: fileStream,
  })
}

Second, an introduction to the file operation modules

The case above uses the fs module, which is essential for file handling in Node.js. With the fs module you can create, delete, query, read, and write files and directories:

  • Read a file: fs.readFile()

  • Create a file: fs.appendFile(), fs.open(), fs.writeFile()

  • Update a file: fs.appendFile(), fs.writeFile()

  • Delete a file: fs.unlink()

  • Rename a file: fs.rename()

These are just a few of the fs module's methods; for how to use them, refer to the official Node.js technical documentation. There are also the fs.Stats class, which encapsulates operations on file information; the fs.Dir class, which encapsulates operations on file directories; the fs.Dirent class, which encapsulates operations on directory entries; and so on.
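
For example, a minimal sketch of the fs.Stats class via fs.stat(), borrowing the demo.jpg path from the directory above for illustration:

const fs = require('fs')

// fs.stat() asynchronously queries file information;
// the callback receives an fs.Stats object
fs.stat('./assets/demo.jpg', (err, stats) => {
  if (err) return console.error(err)
  console.log(stats.size)          // file size in bytes
  console.log(stats.isFile())      // true for a regular file
  console.log(stats.isDirectory()) // true for a directory
  console.log(stats.mtime)         // last modified time
})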

Methods in the Node.js fs module come in asynchronous and synchronous versions, such as the asynchronous fs.readFile() and the synchronous fs.readFileSync() for reading a file's contents. The last argument of an asynchronous method is a callback function, whose parameters carry the error information. It is generally recommended that you use the asynchronous versions: they perform better, run faster, and do not block.
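
A minimal side-by-side sketch of the two styles, borrowing the index.html file from the directory above:

const fs = require('fs')

// Asynchronous: the callback runs when the read completes; nothing blocks in the meantime
fs.readFile('./assets/index.html', 'utf8', (err, data) => {
  if (err) return console.error(err)
  console.log(data)
})

// Synchronous: execution blocks here until the whole file has been read
const text = fs.readFileSync('./assets/index.html', 'utf8')
console.log(text)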

When you manipulate files, you inevitably use the path module, which provides APIs for working with file paths. Its common methods are listed below (a short sketch follows the list):

  • path.basename(), gets the last portion (file name) of a path;

  • path.delimiter, the platform-specific path delimiter (':' on POSIX, ';' on Windows);

  • path.dirname(), gets the directory name in a path;

  • path.extname(), gets the extension in a path;

  • path.join(), joins path segments into one path;

  • path.normalize(), normalizes a path into a standard form
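
Here is the promised sketch of these methods on an illustrative path:

const path = require('path')

const p = '/nodefile/assets/demo.jpg'
console.log(path.basename(p))   // 'demo.jpg'
console.log(path.dirname(p))    // '/nodefile/assets'
console.log(path.extname(p))    // '.jpg'
console.log(path.join('/nodefile', 'assets', 'demo.jpg'))            // '/nodefile/assets/demo.jpg'
console.log(path.normalize('/nodefile//assets/../assets/demo.jpg')) // '/nodefile/assets/demo.jpg'
console.log(path.delimiter)     // ':' on POSIX, ';' on Windows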

Node reads files in two ways: with fs.readFile, or with a stream via fs.createReadStream. If the file to read is small, we can use fs.readFile, which reads the whole file into local memory at once. If you are reading a large file, say one over roughly 16 MB (the larger the file, the greater the performance cost), a one-shot read uses a lot of memory and is relatively inefficient; in that case you need to read with a stream. A stream splits the file data into segments and controls the rate, which is efficient and does not consume too much memory. Whether the file is large or small, fs.createReadStream can be used to read it.
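
A minimal sketch contrasting the two read styles, again borrowing files from the directory above:

const fs = require('fs')

// One-shot read: fine for small files, but loads the whole file into memory
fs.readFile('./assets/index.html', (err, data) => {
  if (err) return console.error(err)
  console.log('bytes in memory:', data.length)
})

// Streamed read: data arrives chunk by chunk, keeping memory usage flat
const readStream = fs.createReadStream('./assets/demo.jpg')
readStream.on('data', chunk => console.log('chunk bytes:', chunk.length))
readStream.on('end', () => console.log('finished reading'))
readStream.on('error', err => console.error(err))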

To make this clearer, let's look at the following case, which uses a cloud function to find out what files exist in its cloud directory (that is, to list the files in the cloud function directory):

const cloud = require('wx-server-sdk')
const fs = require('fs')

cloud.init({
  env: cloud.DYNAMIC_CURRENT_ENV
})

exports.main = async (event, context) => {
  const funFolder = '.' // '.' means the current directory
  // Wrap the callback-style fs.readdir in a Promise so the results
  // are ready before the cloud function returns
  return await new Promise((resolve, reject) => {
    fs.readdir(funFolder, (err, files) => {
      if (err) return reject(err)
      files.forEach(file => {
        console.log(file)
      })
      resolve(files)
    })
  })
}

The fs.readdir() method above asynchronously reads all the files in the cloud function's directory on the server side.

Note that the cloud function's directory is read-only: we cannot write files into it, nor modify or delete the files it contains. However, each cloud function instance provides 512 MB of temporary disk space in the /tmp directory for temporary file reads and writes during a single execution, so we can use cloud functions to add and delete files under /tmp, and the module knowledge above remains useful there.

Third, using cloud functions to operate on temporary disk space

We can also combine our knowledge of Node.js file operations: use a cloud function to create a txt file in the /tmp temporary disk space, then upload the created file to cloud storage.

const cloud = require('wx-server-sdk')
const fs = require('fs')

cloud.init({
  env: cloud.DYNAMIC_CURRENT_ENV
})

exports.main = async (event, context) => {
  // Create a file in the temporary disk space
  const text = "云开发技术训练营CloudBase Camp. "
  // fs.promises.writeFile returns a Promise, so await genuinely waits for the
  // write to complete (the callback form of fs.writeFile cannot be awaited)
  await fs.promises.writeFile('/tmp/tcb.txt', text, 'utf8')
  // Upload the txt file we just created to cloud storage
  const fileStream = fs.createReadStream('/tmp/tcb.txt')
  return await cloud.uploadFile({
    cloudPath: 'tcb.txt',
    fileContent: fileStream,
  })
}

The file above was created with the fs.writeFile() method; we can also use fs.createWriteStream() to do the same:

const writeStream = fs.createWriteStream('/tmp/tcb.txt') // write into /tmp, the writable directory
writeStream.write('云开发技术训练营. ')
writeStream.write('Tencent CloudBase.')
writeStream.end()
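
One caveat: stream writes are asynchronous, so before reading the file back (for example, to upload it to cloud storage) it is safer to wait for the stream's 'finish' event. A minimal sketch, assuming this fragment runs inside the async main above:

const writeStream = fs.createWriteStream('/tmp/tcb.txt')
writeStream.write('云开发技术训练营. ')
writeStream.write('Tencent CloudBase.')
writeStream.end()

// 'finish' fires once all data has been flushed, so it is safe to read the file afterwards
await new Promise((resolve, reject) => {
  writeStream.on('finish', resolve)
  writeStream.on('error', reject)
})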

Note that the directory in which we create the file is the temporary disk space /tmp, not the cloud function's current directory; the temporary disk space is independent of the cloud function and does not live under the cloud function directory.

The temporary disk space offers 512 MB, readable and writable, so we can use it for file-processing turnover during a cloud function's execution phase. But this space may be destroyed after the function finishes executing, so you should not rely on or assume that files stored in it will always exist.
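
The flip side is that a warm (reused) function instance may still have files from a previous invocation sitting in /tmp. A minimal defensive sketch, assuming the tcb.txt name from above:

const fs = require('fs')

const tmpPath = '/tmp/tcb.txt'
// A reused instance may still hold a file from the previous invocation,
// so remove any leftover before writing a fresh copy
if (fs.existsSync(tmpPath)) {
  fs.unlinkSync(tmpPath)
}
fs.writeFileSync(tmpPath, 'fresh content', 'utf8')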

Fourth, cloud functions and Buffer

The Node.js Buffer class gives cloud functions the ability to manipulate file streams and binary network streams: the data a cloud function downloads from cloud storage through the downloadFile interface is of type Buffer, and the uploadFile interface can upload Buffer data to cloud storage. The Buffer class is in the global scope, so we do not need to require('buffer') to use it.

You can also use Buffer for encoding conversion. In the following case, an image stored in cloud storage is downloaded (as a Buffer), converted to base64 via the Buffer class's toString() method, and returned to the Mini Program side. Use the developer tools to create a new downloadimg cloud function, then enter the following code in its index.js:

const cloud = require('wx-server-sdk')

cloud.init({
  env: cloud.DYNAMIC_CURRENT_ENV,
})

exports.main = async (event, context) => {
  const fileID = 'cloud://xly-xrlur.786c-xly-xrlur-1300446086/cloudbase/1576500614167-520.png'
  // Replace this with the fileID of an image in your own cloud storage; the image must not be too large
  const res = await cloud.downloadFile({
    fileID: fileID,
  })
  const buffer = res.fileContent
  return buffer.toString('base64')
}

Create an event handler getServerImg() on the Mini Program side to call the cloud function and assign the data it returns (the base64-encoded image) to img in the page's data object, for example by entering the following code in a page's js file:

data: {
  img: ""
},
getServerImg() {
  wx.cloud.callFunction({
    name: 'downloadimg',
    success: res => {
      console.log("Data returned by the cloud function", res.result)
      this.setData({
        img: res.result
      })
    },
    fail: err => {
      console.error('Cloud function call failed:', err)
    }
  })
}

Add an image component to the page's wxml file (note the src address). Tapping the button triggers the event handler, which calls the cloud function and renders the returned base64 image on the page.

<button bindtap="getServerImg">Tap to render the base64 image</button>
<image width="400px" height="200px" src="data:image/jpeg;base64,{{img}}"></image>

When a cloud function processes images, converting them to base64 carries many restrictions: the image cannot be too large, the data returned to the Mini Program cannot exceed 1 MB, and such images are best kept as temporary files. It is generally recommended that you use cloud storage as a bridge: after processing, upload the image to cloud storage to get a fileID, and then render that fileID directly in the Mini Program.
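
Following that recommendation, here is a minimal sketch of a cloud function that uploads a processed image from /tmp to cloud storage and returns only the fileID (the file names here are illustrative assumptions):

const cloud = require('wx-server-sdk')
const fs = require('fs')

cloud.init({
  env: cloud.DYNAMIC_CURRENT_ENV
})

exports.main = async (event, context) => {
  // Assume an earlier step wrote the processed image to /tmp/processed.png
  const res = await cloud.uploadFile({
    cloudPath: 'processed.png',
    fileContent: fs.createReadStream('/tmp/processed.png'),
  })
  // The Mini Program side can render this fileID directly in an image component
  return res.fileID
}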

Buffer can also be converted to and from String, JSON, and other types; it can handle encodings such as ascii, utf8, utf16le, ucs2, binary, and hex; and it supports operations such as copy, concat, indexOf, and slice. All of these can be applied in cloud functions; we will not introduce them one by one here, and the details can be found in the official Node.js technical documentation.
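
A few of these operations in a short sketch:

// Buffer is available globally in Node.js, no require needed
const buf = Buffer.from('CloudBase', 'utf8')

console.log(buf.toString('hex'))         // utf8 -> hex
console.log(buf.toString('base64'))      // utf8 -> base64
console.log(buf.indexOf('Base'))         // 5, where the substring starts
console.log(buf.slice(0, 5).toString())  // 'Cloud'

const joined = Buffer.concat([Buffer.from('Tencent '), buf])
console.log(joined.toString())           // 'Tencent CloudBase'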

Transferring large files through cloud storage is also worthwhile from a cost perspective: a cloud function moves files to cloud storage over intranet traffic, which is fast and free of charge, while the Mini Program side fetches cloud storage files over a CDN, which transfers well and costs relatively little, about 0.18 yuan/GB.