Obnoxious noises in Apple Mail - how to silence?

Discussion in 'Apple' started by Don Bruder, Apr 28, 2014.

  1. Don Bruder

    billy Guest

    It's easily the best, and best supported, OS X user agent.
    Plus, the guy was 100% clear about where the money is going.
    Not to mention it was an already existing program people
    could evaluate before tossing any money at it.

    Where's the rip in that?
    Can time-limited trials of the "full" version be offered? No.
    You may blame Apple for that.

    Billy Y..
     
    billy, May 8, 2014

  2. Don Bruder

    billy Guest

    Nope. Usenet will cost more. As you'd use it, a lot more. Example,
    since you mentioned Youtube -

    https://peering.google.com/about/ggc.html

    | Google Global Cache

    | We aim to deliver Google content and services as close to users as
    | possible in order to provide the best possible performance and lowest
    | cost for network operators to serve traffic demanded by their subscribers.

    | With GGC, network operators and Internet Service Providers deploy a
    | small number of Google servers inside their network to serve popular
    | Google content, including YouTube.

    Billy Y..
     
    billy, May 9, 2014

  3. Don Bruder

    Lewis Guest

    1) Almost all developers make more from the MAS because they get more
    exposure. The exception is companies that already have well established
    products, and even there... Barebones dropped the price of BBEdit by
    2/3 and seems to be quite happy offering MAS and non-MAS versions.

    2) you can always make a demo available outside the MAS.
     
    Lewis, May 9, 2014
  4. Don Bruder

    Lewis Guest

    In message <lkei0r$21a$>
    That's simply not true. Developers are free to have demo version if they
    want. There's a whole big web out there and there is no exclusivity with
    the MAS.
     
    Lewis, May 9, 2014
  5. Don Bruder

    Guest Guest

    unless they can't, because their app won't work outside of the mac app
    store, such as if it uses icloud.

    regardless, requiring the user to go hunt down a demo version somewhere
    else is ridiculous.

    the lack of demo versions also exists for the ios app store and there
    is no alternate method even if the developer wanted to.
     
    Guest, May 9, 2014
  6. Don Bruder

    billy Guest

    Tin (and presumably any other newsreader) can show anything and
    everything that exists on any particular news server. Oh - you
    want photos, or video? Fire up a web browser, video player, etc.

    Tin can also save this junk to a file, for those who can't live
    without it. Then, there are probably some web hosts that decode
    all this stuff and offer it - an old friend had one (News2Me.COM)
    up for a while, but I see it's no longer there. Just a guess,
    but I'd suspect it was croaked by the DMCA.
    Why? When traffic is continually increasing, "slide into obscurity"
    does not follow...
...even when it's unwanted, abusive crap. People _are_ using it.
    You should find that informative -- the audience you're seeking to
    serve prefers web sites. Not to mention I personally would greatly
    prefer they use them.

    Usenet is doing quite well where access restrictions effectively reduce
    the crap to a tolerable level. The Panix hierarchy (local groups only
    available to Panix customers) is an excellent example.
    I think you're confusing & conflating the quality of the tools with the
    quality of the users. A "better" newsreader is not going to suddenly
    transform an idiot into a polite, well-spoken human being.
    Well, pushing those costs onto third parties is not my idea of a
    solution. If something fails, I think that's because it wasn't good
    enough to carry its own weight. Usenet is not a charity...
    Again, it's not the software, it's the content.

Why would we (maintainers of Usenet) pursue anything that's only going
to pile yet more inappropriate load on a system that is not meant to
support any of it?

    By the way... The Panix reader servers only carry (at this moment) 9550
    text groups. There may be a few binary groups in these, but I doubt it.
    Huh? I'm a very heavy user of email for commercial purposes. I pay
    for it. How much do you think 15TB a day of email would cost?

    Email is a mission-critical service, and is handled accordingly.
    Usenet is not. Usenet is a best-effort service.
    Anyone can generate income with a web site. Usenet is not a
    commercial service. The charters of most groups forbid, or at
    least discourage, commercial use.
    Again, Usenet is not a commercial service. Further, I suspect
    the massive load created by traffic Usenet was never intended to
    support is a significant factor in many ISPs deciding to just
    stop providing it, at all.
    What exactly is wrong with using HTTP to serve HTTP content?

    Is it that it costs too much...
...so it seems.

    Well, I don't want to pay for it, either.
    I don't see any mutual exclusivity here.
    She should use a web browser.
    Most can't build it, true, but it does compile fine on Mac OS. I
    suppose I could build and offer the binaries to those who'd like to
    use it. If someone hasn't already (I haven't looked). I could also
    easily add a script to invoke the terminal emulator and run it.
    Well, yea, of course not.
    Here you want us to be charitable, but elsewhere you want us to buy
    (or at least fund the development of) your "open source." Nobody is
    paid anything for their work on tin, or slrn. Nothing. Not one cent.
    And this will execute under how many different operating systems?

    Tin builds and runs on plenty of them - http://www.tin.org/builds.html
Again, I must say, geez. Feel free to fire up your preferred search
and replace tool and regex your way to happiness.
    You can't call it open source if you want to sell it.
    Why not? It still works perfectly.
    Well, after you've patched art, have a whack at bitmap.
Tin is not going to cater to how people _ABUSE_ Usenet. Not in 2014,
not ever. I seriously doubt slrn ever will, either.

    I think you need some experience maintaining NNTP and NNRP servers.
    I'm very happy with it as it is. I _don't want_ photos, video, et cetera.
    Just say it with words, and perhaps a cite or two, and I'll be happy.
    This is what HTTP does so well.
    So, use a non-group-centric protocol.
    Yes, it is. Does that justify your piling yet more load on a service
    never intended to support (any of) it?
No - most servers expire stuff, and in particular, expire the material
you'd like to clog them with quite rapidly. There are some archives,
but only for the text groups.
    No, the cost is essentially the same. Just because no one is forcing
    you to pay for it does not mean no one at all has to cover the costs.

    Somebody is absolutely, positively paying for it.
    DejaNews. R.I.P.

    There are some people who save things, ala olduse.net - but are they
    going to want to archive massive amounts of your data? I think not.
    Again, who is going to pay for all this?
    We don't care. We're happy with what we have, and don't like freeloading
    interlopers piling on the work, and expenses, of maintaining the system.
    Huh? All it takes is a quick, and shallow, perusal of the sources.
    Huh? You want to sell your own code, but you want us to write it for free?
    Speaking of unprofessional.
    I merely suggested you give it a look, and benefit from their decades of
    experience creating a newsreader that works well in the real world.
How does the existence of a transportable library relate to getting the
IETF, et al., to agree to support your intended uses of Usenet?
    Again, you can avoid a lot of that by simply observing what others have
    done to fix or work around them.
    I don't see how you can say that - you don't know any of them, nor how
    they work, nor what they want to accomplish. Speaking of disingenuous.

    Billy Y..
     
    billy, May 9, 2014
  7. Don Bruder

    Doc O'Leary Guest

    Already answered. My corporate policy has always been software as a
    service, and things get open sourced once development costs have been
    covered. You, or anyone, is welcome to cover those costs individually
    or collectively via crowd funding, or you can default to the model of
    waiting until there is a product to buy.

But, wait, there is a light at the end of the tunnel! I'm far enough
    along that I want to do MIME handling, and in searching around for that
    I *finally* came across a library for NNTP that looks reasonably well
    done. MailCore (libmailcore.com) is a Cocoa wrapper for LibEtPan
    (etpan.org), which itself supports not only IMAP/POP/SMTP email, but
NNTP and RSS/Atom feeds. It's something I wish I had found 2
months ago.

    That said, I'm not sure I'm going to switch my NNTP framework to
    wrapping it. It is at a lower level than is useful for a modern
    newsreader (e.g., hardcoded things like XOVER rather than CAPABILITIES
    detection, etc.), but it appears to be a great starting point for anyone
who wants/needs something C-based. Far, far better than having to dig
out something useful from the likes of tin or slrn.
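The XOVER-vs-CAPABILITIES point is easy to illustrate. Here is a minimal sketch (a hypothetical helper, not code from MailCore or LibEtPan): instead of hardcoding the pre-standard XOVER command, check the server's RFC 3977 CAPABILITIES response and only fall back when OVER isn't advertised.

```python
def choose_overview_command(capabilities):
    """Pick the overview command from an NNTP server's CAPABILITIES
    response (RFC 3977). Each capability line starts with a label,
    e.g. "OVER MSGID"; fall back to the pre-standard XOVER when the
    standard OVER command is not advertised."""
    caps = {line.split()[0].upper() for line in capabilities if line.strip()}
    return "OVER" if "OVER" in caps else "XOVER"
```

A client would feed this the multi-line response to a `CAPABILITIES` command and then issue whichever overview command came back; the same pattern generalizes to other optional extensions.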
     
    Doc O'Leary, May 9, 2014
  8. Don Bruder

    Doc O'Leary Guest

    Like I said, *of course* people have been pushing their proprietary CDNs
    to do what Usenet could have been doing all along. If I were an ISP,
    I'd rather have one generic CDN protocol on servers that I control
    rather than having hundreds of content providers trying to co-locate
    their own black box hardware inside my operation. Usenet can benefit
    *everyone*, not just the big players like Google.
     
    Doc O'Leary, May 9, 2014
  9. Don Bruder

    Doc O'Leary Guest

    So it's just not that interesting. Pre-orders have been around for
    ages. My ideal in crowdfunding is to offer people something they don't
    usually get from a company. I mean, if all the costs are covered before
    work even begins, it just doesn't seem right to *not* give away the
    software for free, and possibly go the extra mile of open sourcing the
    efforts. I mean, in any other contract work for hire that I do, I don't
    get to keep the code, so I'd have to be a pretty big dick to treat
    *people* worse than I treat *companies*.
    It's a general problem with crowdfunding, not that particular project.
    When the costs are covered ahead of time, there is no risk taken on by
    the project. A decent person/company would offer something in return
    for that, not just a $50 value for your $50 investment. Possibly,
    maybe, that is, unless the project fails to deliver or was an outright
    scam from the beginning.
    Again, not true. Anyone who could bother to support switches for in-app
    purchases and/or subscriptions can just as easily turn things on/off
    based on time. There are *thousands* of games that ship with a
    "freemium" model that starts off with all gauges on MAX and then slowly
    degrade based on usage. All evidence is against your position.
     
    Doc O'Leary, May 9, 2014
  10. Don Bruder

    billy Guest

    Anyone can post to Youtube.
    But, you're continually ignoring the fact that Usenet was not created
    nor intended to do this.
    Where do you get hundreds?
    "My corporate policy has always been software as a service..."

    Again, it all comes down to who's paying for it. In addition to some
    Usenet admin experience, you'd benefit from a bit of ISP experience as
    well.
    _Something_ may benefit everyone. Usenet is not it. You're putting
    the cart before the horse. It'd be best to first find a horse.

    Billy Y..
     
    billy, May 9, 2014
  11. Don Bruder

    billy Guest

    In this respect, MailMate easily qualifies.
    And you're offering ... what exactly?
    You might want to read Apple's policy (requires registration to view) -

    https://developer.apple.com/appstore/mac/resources/approval/guidelines.html

    Billy Y..
     
    billy, May 9, 2014
  12. Don Bruder

    JF Mezei Guest

The demo versions are often called "Lite". They are free, give you a
taste of the full version but have a number of features disabled.
     
    JF Mezei, May 9, 2014
  13. Don Bruder

    Guest Guest

    sometimes they do that and sometimes they're time-limited, or with
    games, level-limited.

    apps that aren't in the app store can do the same thing.
     
    Guest, May 10, 2014
  14. Don Bruder

    Your Name Guest

Yep. Unfortunately there's no real "standard" way for naming apps. Some
    developers reverse that by using a normal name for the free / cheaper
    version, and then tacking "Pro" on the end of the more expensive
    version.
     
    Your Name, May 10, 2014
  15. Don Bruder

    Ant Guest

    Ditto!
    --
    "The world flatters the elephant and tramples on the ant." --Indian
    /\___/\ Ant(Dude) @ http://antfarm.ma.cx (Personal Web Site)
    / /\ /\ \ Ant's Quality Foraged Links: http://aqfl.net
    | |o o| |
    \ _ / If crediting, then use Ant nickname and AQFL URL/link.
    ( ) If e-mailing, then axe ANT from its address if needed.
    Ant is currently not listening to any songs on this computer.
     
    Ant, May 10, 2014
  16. Don Bruder

    billy Guest

    Again, it all comes down to who's paying for it. In addition to some
    So -- would you rather have this...

    http://preview.tinyurl.com/njxvlaq -or-
    http://arstechnica.com/information-...opping-packets-every-day-over-money-disputes/

    | Level 3 claims six ISPs dropping packets every day over money disputes
    |
    | Network provider doesn't name and shame ISPs guilty of "permanent
    | congestion."
    |
    | by Jon Brodkin - May 5 2014, 10:25am MST
    |
    | Network operator Level 3, which has asked the FCC to protect it from
    | "arbitrary access charges" that ISPs want in exchange for accepting
    | Internet traffic, today claimed that six consumer broadband providers
    | have allowed a state of "permanent congestion" by refusing to upgrade
    | peering connections for the past year.
    |
    | Level 3 and Cogent, another network operator, have been involved in
    | disputes with ISPs over whether they should pay for the right to send
    | them traffic. ISPs have demanded payment in exchange for accepting
    | streaming video and other data that is passed from the network providers
    | to ISPs and eventually to consumers.
    |
    | When the interconnections aren't upgraded, it can lead to congestion and
    | dropped packets, as we wrote previously regarding a dispute between
    | Cogent and Verizon. In a blog post today, Level 3 VP Mark Taylor wrote:
    |
    | A port that is on average utilized at 90 percent will be saturated,
    | dropping packets, for several hours a day. We have congested ports
    | saturated to those levels with 12 of our 51 peers. Six of those 12
    | have a single congested port, and we are both (Level 3 and our peer)
    | in the process of making upgrades--this is business as usual and happens
    | occasionally as traffic swings around the Internet as customers change
    | providers.
    |
    | That leaves the remaining six peers with congestion on almost all of
    | the interconnect ports between us. Congestion that is permanent, has
    | been in place for well over a year and where our peer refuses to augment
    | capacity. They are deliberately harming the service they deliver to
    | their paying customers. They are not allowing us to fulfill the requests
    | their customers make for content.
    |
    | Five of those congested peers are in the United States and one is in
    | Europe. There are none in any other part of the world. All six are large
    | Broadband consumer networks with a dominant or exclusive market share in
    | their local market. In countries or markets where consumers have
    | multiple Broadband choices (like the UK) there are no congested peers.
    |
    | [...]

...or a few CDNs co-locate their own servers at your ISP?

    Billy Y..
     
    billy, May 10, 2014
  17. Don Bruder

    Doc O'Leary Guest

    Your argument is not consistent. If you call the bulk of traffic
    "junk", you don't get to hold it up as a success story. The landscape
    is much more complex than that, as we all know. Your flawed logic
    aside, the fact remains that text discussions on Usenet peaked long ago.
    The decline is so bad that, hell, I can't even find detailed statistics
    anymore that could either support or refute me!
    Right . . . using it for things that tin does *not* support very well.
    I disagree. Those same masses have flocked to mobile apps as well.
    What they want clearly is not web-centric, so the rational mind would
    look to what features are offered by other services that are not
    commonly offered by Usenet clients.
    That's your straw man, not mine. My only desire is to increase the
    *number* of people on Usenet. A better client *will also* go a long way
    to attracting people who are discerning but don't have the technical
    skills/desire to run command line tools. That includes a feature not
    offered by most web sites: the ability to filter out the morons.
    It's not "pushing" costs, it's simply more efficient allocation of
    resources. It is *stupid* to have every leaf node individually go to a
    central server to get the same content over and over. It's better for
    everyone to pull from a local cache instead. Again, we could argue
    about whether or not NNTP is the best way to build such a CDN, but
    that's another discussion.
    For most people, there is NO difference. Most people don't know HTTP
    from NNTP from XMPP from RSS/XML from EIEIO. All they know is that they
    want to *do* something, and some software makes it easy where other
    software makes it hard.
    Wrong perspective. You're in a *service* industry, so do your fucking
    job and start better serving the needs of the people. If you don't,
    then you should not be at all surprised when they flock away to services
    that do. Nor should you be surprised when even *more* "inappropriate"
    traffic takes over the network. It is *you* who is to blame for the
    neighborhood going to hell.
    Great, for those that are part of that special in-crowd. But there is a
    large continuum between that and Penny Arcade's "Shitcock".
    Reread your own post. You said "directly, at least". I don't know
    about you, but *I* certainly don't have anyone sending me money directly
    in email (or Usenet). They are a loss leader, lumped in with other
support and/or communication costs that are a part of doing business.

    So the problem with Usenet binaries is not that there are 15TB daily,
    but that there *may* be wasteful overhead on the distribution of that
    content. That problem, if it exists, is something that could easily be
    addressed if the "maintainers of Usenet" really cared to.
    It's all very fluid. Some people now consider Twitter or Facebook to be
    more mission-critical. Or web SEO. Hell, there are a ton of web sites
    that would rather have to use a contact form than send them an email.
    Again, once you stop cherry picking, the evidence is actually against
    your position.
    Then it's a shame that new groups can *never* be created . . .
    Please. As I have noted and linked to, the changes are mainly due to
    "questionable" content more than the bandwidth or disk space used. I
    burn through more bandwidth these days streaming video
    (Hulu/Netflix/YouTube/etc.) than I ever did (or ever will) with Usenet.
    There is no such thing as HTTP content. Like NNTP, HTTP is merely a
    protocol that can be used to *deliver* content. Now, you could
    certainly argue that it does a better job for delivering binary content
    than Usenet, but it's all debatable depending on the needs you have.
    Given all the "cloud" BS that get thrown around, you'd think people
    might take a closer look at how NNTP might better serve their needs.
    Again, it's that it is limited to favor what it was designed to do,
    which might not be what people *really* want to do. I don't *want* to
    shove news into a big XML file that an RSS reader has to poll regularly
    and download entirely over HTTP, and then jump through hoops to track
    what is or isn't "new". But somehow that became preferable to just
    making a post to Usenet, despite the obviously increased overhead and
    costs.
    Tough. Everybody pays, and the cost is usually *higher* when you can't
    account how you're paying for something.
    It's not, but the rational thing to do is follow best practices wherever
    possible.
    Point to me a good web-based Usenet client. Must support subscriptions
    and filtering.
    I hope you're being obtuse on purpose. It's not that it won't *build*
    on a Mac, it's that a Mac user expects a Mac app to run in some place
    other than Terminal. They want WIMP, and you know it.
    No! Again, it's only "of course not" because the developers are stuck
    with some crappy ol' code they can't be bothered to modernize. The fact
    that tin is increasingly brittle is not something that should be
    acceptable.
    They are, just not in any way that's been accounted for. I mean, I
    absolutely *could* release what I'm doing for free, too, but not without
    having some way to otherwise pay my bills. I respect people enough to
    present them with the reality of economics, not hide it behind a day job
    or advertising or whatever other ways "free" things get paid for.
    All the ones I'm being paid to get it running on. :)
    Great for people who still use tin on fringe systems. You sound a lot
    like the OpenSSL goons who couldn't bear to excise similar crap from
    their code. Hell, I'm not even going to bother testing what I'm doing
    on Macs prior to 10.9. Not unless somebody is going to cover supporting
    older systems, of course.
    Again, I point to it as a symptom of a project's underlying disease.
    There are *scores* of ugliness that I'd have to undo to make the code
    reasonable. And if *they* cared, they'd already have taken steps to
    make it better. Since that's not the case, they have self-selected for
    a poor developer community, and should not be surprised when I don't
    want to be a part of that.
    You're not lying to me.
    Already done. Like I said previously, I based it on NSScanner and
    NSIndexSet. Blazingly fast compared to MT-NW on a 2MB newsrc I tested
    with; can't say how it'd compare to tin. The code is *way* more
    readable, though. Perhaps you'll see it someday.
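For context, the .newsrc format that code is parsing is simple enough to sketch. This is an illustrative Python re-implementation of the idea, not the NSScanner/NSIndexSet code being described: each line is a group name, a `:` (subscribed) or `!` (unsubscribed) marker, and a comma-separated list of read article numbers and ranges.

```python
def parse_newsrc_line(line):
    """Parse one .newsrc line, e.g. "comp.sys.mac.system: 1-3,5",
    into (group, subscribed, read_articles). ':' marks a subscribed
    group, '!' an unsubscribed one."""
    if ":" in line and ("!" not in line or line.index(":") < line.index("!")):
        group, _, ranges = line.partition(":")
        subscribed = True
    else:
        group, _, ranges = line.partition("!")
        subscribed = False
    read = set()
    for part in ranges.split(","):
        part = part.strip()
        if not part:
            continue
        if "-" in part:
            lo, hi = part.split("-")
            read.update(range(int(lo), int(hi) + 1))  # inclusive range
        else:
            read.add(int(part))
    return group.strip(), subscribed, read
```

A real newsreader would use interval sets rather than expanding every range (a 2MB newsrc can encode millions of article numbers), which is presumably where the NSIndexSet-style approach pays off.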
    Look, we get it! Anything that makes tin still relevant is the True
    Scotsman of Usenet usage, and anything that makes tin look bad is
    horrible abuse. Bury that dead horse already.
    I'd rather update the protocol to something more in line with how the
    CDN is being used.
    Good for you, old man. But the world has moved on, what with the kids
    and their Twattering and FaceBorking. Again, it's all a question of
    whether or not what you've built in the past can be built on in the
    future. If not, say goodnight Gracie.
No, it doesn't. I'm not going to go too far down another protocol
branch in this thread, but it will be clear to anyone who has ever tried
to get structured content off of a website that there is *nothing*
it brings to the game for standardizing contact information.
    Yes. The proper perspective is to *fix* the service to support its
    actual utility. Clearly you'd prefer to fossilize.
    What "most" servers do is not the issue. As long as there is still a
    market for *any* service with long retention times, you have a means to
    that end.
    Again, you've got the economics all wrong. A shared cost is not
    necessarily a *higher* cost; it is more often just the opposite.
    I am. And you are. As are thousands of others. Because we're all
    pooling our resources, we exploit economies of scale and things are
    cheaper for us. Compare that to the crappy double-dipping that
    companies are screwing around with and it should be clear that you're
    completely wrong on these issues.
    You support my point. My file is on *Usenet*, not DejaNews.
    They do. You can argue that maybe they shouldn't, or that they might
    not in the future. But that *is* the state of things today for Usenet,
    which is far more than can be said for all the newcomer cloud services.
    We all are. It's not a problem, because it's a better way to do this
    than the alternatives.
    As someone who has bothered to go farther than that myself, you're full
    of it. I already gave a simple example: I want to use tin as the basis
    for nntp support in cURL. Show me the quick, shallow hooks that can
    make it happen.
    Straw man again. I never said you should be doing anything for free. I
    merely said it is unprofessional to be a "brogrammer".
    And I told you what I saw. Either you can take a step back and look at
    it from a neutral perspective, or you can continue to bury your head in
the sand and think that everything is just perfect. It's all too clear
    which choice you're making.
    It doesn't. That was just another straw man *you* threw in there as a
    distraction. My point was that tin and slrn probably *do* adhere to the
    standards much better at their place in the SDLC than any new
    implementation does, and that is why it is a shame that what has been
    done by them cannot be reused.
    Again, not if those fixes are buried deep in spaghetti code.
    I know them by how they interact with other people, and how they
    approach criticism. For example, one person who claimed to be a core
    FreeBSD developer flat-out stated that they didn't care about anyone who
    didn't submit code to the project. What a shitty way to be dismissive
    of people who could otherwise help! That's a toxic environment, and I
    will not be a part of a community that is like that.
     
    Doc O'Leary, May 10, 2014
  18. Don Bruder

    Doc O'Leary Guest

    How? It certainly doesn't let me.
    I'm not ignoring it. It's irrelevant given the realities of what Usenet
    *is*. The proper approach is to make it stronger for how it *is* used,
    not moan and cry about how it's not being used right.
    I use my brain to think about the future. Try it. It's pretty cool!
    If someone wants to pay me to solve their problems, great. That's kinda
    what I do. If they instead just want to complain about their problem
    and expect free sympathy . . . get an intern, maybe?
    The horse was there long before the cart, but the ISPs killed it. As
    you well know, even without them, the traffic keeps going up . . .
     
    Doc O'Leary, May 10, 2014
  19. Don Bruder

    billy Guest

    Perhaps you'd like to explicitly state what your problem there is?
    As I've said, you'll need to take this up with the IETF. Example -

    http://tools.ietf.org/html/draft-ietf-usefor-useage-01

    And a bit of helpful guidance -

    http://www.rfc-editor.org/rfcfaq.html

    If you don't resolve this first, propagation of your desired content will
    suck, big time. Arts can, among other parameters, be rejected by size, and
    each server can be set to a specific limit independently.
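On the size-rejection point: with INN, for instance, the per-server cap is a single setting in inn.conf (a hedged sketch; the value shown is illustrative, but `maxartsize` is the actual knob):

```
# inn.conf (INN) -- articles larger than this many bytes are rejected
# by this server; each site sets its own limit independently.
maxartsize: 65536
```

Since every peer chooses its own limit, an oversized article may be accepted by one server and silently dropped by the next hop, which is exactly why propagation of large content is unpredictable.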

    And, even if you do succeed in revising the protocol, that's no guarantee
anyone will implement it. "Make it stronger" involves a significant amount
    of money - the 14TB+ of daily crap isn't going away, and you want to pile
    yet more - who will be paying for that? The same people paying for the
    current load, you say? Nope - a good number of them are already ignoring
    it.

    Thus, there's the very real possibility your content will wind up on a mere
    handful of hosts, like Altopia. If it can get there in the first place.

Since you appear not to be enamored of Google (centralized control of data),
    this cold, hard fact should give you pause...
    You and your brain should take a good, hard look at all the consolidation
    in vendors, including CDNs, going on right now. Speaking of (not) cool.

    http://www.bizety.com/2014/04/07/tough-road-ahead-limelight-networks/
    Nobody's going to pay you to create yet more problems.

    And - hard as it is to believe - I'm talking about who pays for the
    additional resources you want to eat. How could you possibly ignore
    that, to say nothing of confusing it with what you might get paid?

    I did a bit of looking around trying to answer my own question, and
    found, for example, this illuminating bit of info -

    http://blade.nagaokaut.ac.jp/cgi-bin/scat.rb/ruby/ruby-talk/26555

    | Subject: [ruby-talk:26555] Re: Selector Namespaces: A Standard Feature
    | for Smalltalk?
    | From: "David Simmons" <david.simmons smallscript.net>
    | Date: Tue, 27 Nov 2001 06:06:56 +0900
    |
    | Doc O'Leary,
    |
    | I've read your post/response and I don't know how to respond. You seem to
    | think that "versioning" is the solution to all the ills when I've pointed
    | out issues that versioning simply doesn't address. And my posts were merely
    | augmenting/clarifying items relating the JavaScript 2.0 references cited in
    | the original post on selector-namespaces.
    |
    | I've also made it abundantly clear that versioning is already a supported
    | language option and that the PRAGMATIC issue is that in REAL situations of
    | just-in-time deployment it NOT always an option. Why are you ignoring the
    | realities of software deployment and telling me what I already know about
    | the ideals of versioning.
    |
    | You also prattle on about namespaces with non-specific "dogma", yet you
    | could not have had experience with "selector namespaces" which ARE NOT just
    | "namespaces" as we see in other systems. They are a NEW Unifying Feature for
    | Dynamic Languages that provide a more general solution to what SOME static
    | languages have provided in the form of "public", "private", "protected".
    |
    | They are not meant to be a SUBSTITUTE or REPLACEMENT for versioning [rather
    | they augment it], they are designed to address REAL issues [which one can
    | demonstrate "versioning" does not adequately address; as per one (or more)
    | of my posts] in the area of effective/reliable deployment in 3rd party late
    | integration scenarios.
    |
    | -- Dave S. [www.smallscript.org]

    Gosh. At this point I'm feeling quite like Mr Simmons did 13+ years ago.
    The fact that the amount of abuse has grown to be in the vicinity of the
    load you'd like to dump on Usenet does not in any way whatsoever justify
it, nor does it render it anything other than the abuse it actually is, period.

    To be 100% clear, "it" immediately above includes your intended traffic.

    If you don't want to be an abuser, you must first get the protocol
    revised to include your intended use of it.
    Why should they devote any of their time, money, and resources to
handling daily terabyte loads of abusive, inappropriate content?

    Yea, some of them filter out most of the junk, but others, lacking
    the modest expertise required to even do that, just throw out the
    baby with the bath water.
    And this justifies nothing, at all.

    Billy Y..
     
    billy, May 10, 2014
  20. Don Bruder

    Doc O'Leary Guest

    I'm not sure how. You can either pay them directly for a $50 license or
    go through crowdfunding to pay $50 for a license. What am I missing?
    Nothing. I'm not running a crowdfunding campaign. But, if someone
    wanted to run one for me, I would do just as I've said: consider it a
    work for hire, put out the app for free to everyone, and make the source
    code available.
    I've read it many times. You might want to state exactly where you
    think thousands of approved games have gone wrong.
     
    Doc O'Leary, May 10, 2014
