
How can I speed up node.js file operations?


In my project, a client will request to download a file from server with an id. I have to perform the following operations:

1. Validate the id against MongoDB
2. Check the extension
3. Check whether the file exists
4. Read the file and write its contents to the response

I am using the following code for checking file and sending the response.

```javascript
fs.exists(filename, function(exists) {
  if (!exists) {
    res.writeHead(404, '', { "Content-Type": "text/plain" });
    res.write("404 Not Found\n");
    res.end();
    return;
  }
  fs.readFile(filename, "binary", function(err, file) {
    if (err) {
      res.writeHead(500, '', { "Content-Type": "text/plain" });
      res.write(err + "\n");
      res.end();
      return;
    }
    res.setHeader("Pragma", "public");
    res.setHeader("Cache-Control: private, max-age=3600");
    res.setHeader("Transfer-Encoding: chunked");
    res.setHeader("Range: chunked");
    res.writeHead(200, '', { "Content-Type": contentType });
    res.write(file, "binary");
    res.end(file, "binary");
  });
});
```

Within a few milliseconds, the client will request hundreds of files. The supported file types are image, audio, and video.

When there are lots of files in the folder, node.js takes too long to serve each download. How can I improve the performance?

Problem courtesy of: Damodaran

Solution

I would recommend a few things.

You should not be using 'binary'. Don't pass an encoding at all. By adding the encoding, you are making node do a ton of extra work to convert the file's Buffer object into a binary-encoded string. When you then call write with 'binary', node has to do that same operation in reverse. Also, you are passing the file to both end and write, meaning every file you download will contain the file data twice.

I'd recommend against using readFile . Since readFile passes the whole file contents back to you in your file variable, you are requiring node to load the whole contents of the file into RAM, meaning it needs to allocate a ton of buffers and then concatenate them, which is unneeded work.

There is no reason to use exists separately, because if you try to open a file that does not exist, the error will tell you, so checking first is just extra work.

Also, the Transfer-Encoding header is set automatically; you don't need to set it yourself.

Something like this should be faster:

```javascript
fs.createReadStream(filename)
  .on('error', function(err) {
    if (err.code === 'ENOENT') {
      res.writeHead(404, { 'Content-type': 'text/plain' });
      res.end('404 Not Found\n');
    } else {
      res.writeHead(500, { 'Content-type': 'text/plain' });
      res.end(err + '\n');
    }
  })
  .on('open', function() {
    res.writeHead(200, {
      'Pragma': 'public',
      'Cache-Control': 'private, max-age=3600',
      'Content-type': contentType
    });
  })
  .pipe(res);
```

Solution courtesy of: loganfsmyth
