Hello,

I've run into a strange problem. When I tried to stream media, and I mean
huge media, over the network using node.js, the client side received an
uneven stream, so viewing was not smooth.

Based on my research (Googling), some people say that node.js consumes a lot
of memory. Is this true?

The problem above occurs on a network of only 3 people, and only in a demo.
But what about a network of 10,000 or more users?

Can node still handle this streaming? If not, what should I do to improve
the user experience?

Thanks in advance for understanding my concern.

--
Job Board: http://jobs.nodejs.org/
Posting guidelines: https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to nodejs@googlegroups.com
To unsubscribe from this group, send email to
nodejs+unsubscribe@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en


  • Eric S at Nov 20, 2012 at 4:31 am

    On Monday, November 19, 2012 7:52:43 PM UTC-8, Ket wrote:
    Based on my research (Googling), some people say that node.js consumes a
    lot of memory. Is this true?

    It can be. In my limited experience, that has a LOT more to do with how
    you code your application than node itself, however. You need to make sure
    that you're efficiently streaming the media (not reading too far ahead of
    what's been sent, etc). You might find more information searching here for
    "back pressure" as I know I've seen that term used in this context.

    Basically, if you just read in the file as fast as you can and send it, you
    could potentially wind up buffering most of the file on the output side,
    since disk IO is usually much faster than network IO once you're past the
    LAN (and sometimes even within the LAN, particularly if the file is cached
    in RAM). Even on a LAN, the client speed might prove to be a bottleneck
    rather than the LAN, though that's more likely the case with mobile devices
    rather than modern desktop machines.
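
    A minimal sketch of the efficient approach: letting `pipe()` manage the
    back pressure instead of reading the whole file. The path and port here
    are placeholders for illustration, not from the original post.

    ```javascript
    const fs = require('fs');

    // serveFile streams `filePath` into an HTTP response using pipe(),
    // which applies back pressure automatically: the file read is paused
    // whenever the client's socket buffer is full, so memory use stays at
    // roughly one chunk (~64 KB) per connection instead of the whole file.
    function serveFile(filePath, res) {
      const stream = fs.createReadStream(filePath);
      stream.on('error', function () {
        res.statusCode = 500;
        res.end();
      });
      stream.pipe(res);
    }

    // Illustrative usage (path and port are placeholders):
    // require('http').createServer(function (req, res) {
    //   serveFile('/media/huge.mp4', res);
    // }).listen(8000);
    ```

    Contrast this with `fs.readFile()` followed by `res.end(data)`, which
    holds the entire file in memory per request.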

    A few questions concerning the content: Is it static, or live streaming?
    Is it a single stream, or multiple? And how huge is huge? Any of these
    could affect any recommendations.

  • Ket at Nov 20, 2012 at 4:50 am
    Thanks,

    It's live streaming. Think of cable TV on the internet, so you can
    imagine: it's huge.

    And thanks for mentioning mobile devices. I forgot about that. In the
    future, it would support mobile devices too.

    It's maybe multiple streams, because I allow my users to share what they
    view too. This is to create a social network so they can invite friends
    and build a larger user base. It might be a bad idea, but I have yet to
    see.

    My code is very basic, just minimal enough to get things working.

    So, what is the efficient way to manage RAM usage?

    Thanks
  • Tim Caswell at Nov 20, 2012 at 1:31 pm
    Node uses "a lot" depending on what that means. Besides the actual amount
    of buffer data your app logic keeps in RAM, node itself uses about 10 MB
    of overhead for the bare process. Also, the GC is optimized more for
    speed than memory usage: it doesn't really start freeing until it feels
    memory pressure. As was mentioned in the memory-leak thread recently, you
    can enable GC hooks and manually pressure it to collect before it hits
    the normal threshold.
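
    The GC hook mentioned above is exposed by running node with the
    `--expose-gc` flag, which makes `global.gc()` available; a guarded
    sketch (the function name is illustrative):

    ```javascript
    // Run with: node --expose-gc app.js
    // Without the flag, global.gc is undefined and this is a no-op.
    function maybeCollect() {
      if (typeof global.gc === 'function') {
        global.gc(); // full, synchronous collection; use sparingly
        return true;
      }
      return false;
    }

    console.log('heap used before:', process.memoryUsage().heapUsed);
    maybeCollect();
    console.log('heap used after:', process.memoryUsage().heapUsed);
    ```

    Forcing collection this way trades throughput for a smaller heap, so it
    only makes sense when memory is genuinely tight.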

    So if you're on a mobile device and need a dozen processes, node can seem
    like a memory hog. If you're on a 16 GB server and only have a single
    node process, the overhead is insignificant. In that case, all that
    matters is how your program handles its buffers.
  • Jeff Barczewski at Nov 20, 2012 at 6:31 pm
    Node.js, being an evented server, is more efficient at handling many
    concurrent connections than a multi-threaded or multi-process server:
    each thread or process in those other technologies incurs overhead that
    an evented server like Node.js avoids.

    Node.js, which uses Chrome's V8 JavaScript engine, is by default
    configured to use only about 1GB of memory, but that limit can be
    increased in recent versions of node.js, which include the V8 r9823 fix
    allowing the limit to be raised beyond 2GB on 64-bit servers (see
    http://code.google.com/p/v8/issues/detail?id=847 for details on how to
    give V8 more memory). However, you might want to consider running a
    cluster of nodes on your server before increasing the memory; see the
    discussion below.
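
    The heap limit is raised with the `--max-old-space-size` flag (value in
    megabytes). The inline script below just proves the process starts; for
    a real app you would substitute your own entry point:

    ```shell
    # Raise V8's old-space limit to 4 GB (value is in megabytes).
    # For a real app, replace the inline script with your entry point, e.g.
    #   node --max-old-space-size=4096 server.js
    node --max-old-space-size=4096 -e "console.log(process.memoryUsage().heapTotal > 0)"
    ```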


    As to why your streaming might be slow, there are many things to consider:


    - Have you pre-compressed the video so that you are not serving more data
    than necessary? Choose a good codec (WebM, H.264, etc.) to keep as much
    quality as possible for the amount of data to serve. The more compression
    you use, the lower the load on your servers, bandwidth, etc.
    - Are you properly serving streams of the data and not holding too much
    in memory? (i.e. not reading in all the data, but rather reading as a
    stream)
    - Are the users receiving the same data (live stream), such that you can
    read once and send to all connected users? (not re-reading the same data
    for each connection)
    - Do you have sufficient bandwidth between your server and users? Also
    between media storage and server?
    - Have you matched the size of the data sent back to the size your
    devices can handle? The more mismatch in the chunk size, the more
    buffering the server will be doing, and thus the more memory it will be
    using. Is back pressure being applied all the way back up the chain in
    your code, so that you are pausing reads while waiting for data to be
    consumed?
    - If you are going to be scaling up to lots of users that are not
    receiving the same stream, then you might want to consider running a node
    cluster: multiple node instances (managed by the cluster module) on a
    multi-processor server with lots of memory. That way the load is
    distributed over many processes, each using its own heap of memory.
    - If you outgrow a node cluster on a single server, then go to a load
    balanced multi-server arrangement, and you can continue to scale up,
    assuming you are not bottlenecked elsewhere (bandwidth, media server, db,
    etc.)
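
    The cluster point above can be sketched with node's built-in cluster
    module; the port and response body here are illustrative placeholders:

    ```javascript
    const cluster = require('cluster');
    const http = require('http');
    const os = require('os');

    // Spread load over one worker per CPU core; each worker has its own V8
    // heap, so no single process holds all of the buffered stream data.
    function startCluster(port) {
      if (cluster.isMaster) {
        os.cpus().forEach(function () { cluster.fork(); });
        cluster.on('exit', function () { cluster.fork(); }); // replace dead workers
      } else {
        http.createServer(function (req, res) {
          res.end('served by worker ' + process.pid + '\n');
        }).listen(port); // workers share one listening socket
      }
    }

    // startCluster(8000);
    ```

    The master accepts no traffic itself; incoming connections are
    distributed among the workers listening on the shared socket.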


    So those are some of the things to consider when trying to build a scalable
    streaming media node.js server.

    If you consider all of these when building your system, node.js can be
    very scalable. Properly architected and coded, it should use less memory
    than a non-evented architecture; if you are serving the same stream to
    multiple clients (a live stream), it becomes even more efficient, since
    you can read once and send the same data to all.
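
    The read-once, send-to-all idea can be sketched as a fan-out: each
    source chunk is written to every connected client rather than re-read
    per connection. This is a naive sketch (the function names are
    illustrative, and a real server would handle slow clients properly):

    ```javascript
    // Clients are the response streams of currently connected viewers.
    var clients = [];

    function addClient(res) {
      clients.push(res);
      res.on('close', function () {
        clients = clients.filter(function (c) { return c !== res; });
      });
    }

    // Called once per chunk from the live source; the chunk is read once
    // and fanned out to every client.
    function broadcast(chunk) {
      clients.forEach(function (res) {
        // write() returning false means this client is falling behind;
        // a real server would pause the source or drop the client here
        res.write(chunk);
      });
    }
    ```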

    Companies like Voxer and Transloadit use node.js to build massively
    scalable systems, so they are good examples of what can be done given
    the right architecture.




Discussion Overview
group: nodejs
categories: nodejs
posted: Nov 20, '12 at 3:52a
active: Nov 20, '12 at 6:31p
posts: 5
users: 4
website: nodejs.org
irc: #node.js
