Uplift - Still a good idea in 2021?


Extrude Ragu

You are about to reply to a thread that has been inactive for 879 days.

Please take a moment to consider if this thread is worth bumping.

Recommended Posts

It's important to understand that Amazon is not taking a moral stand. They are closing down Parler because they could become legally liable. This is about protecting their own hide. The laws in the US require that platforms make a good faith effort towards moderation or be held liable for user content, and a key part of Parler's image was that they did not moderate and gave voices to people moderated off other platforms. They aren't making that 'good faith effort.'

You can find terrible things on Facebook, YouTube, Reddit, and yes, SL, but what all of those platforms have in common is a terms of service that they make a 'good faith' effort to uphold. If you report something, it might be removed. Parler wasn't removed from AWS until they refused to implement a moderation plan to remain on the Apple app store.

The actual effectiveness of any of those platforms' moderation is sort of beside the point. What matters from a liability standpoint is that the effort is made.


29 minutes ago, Bitterthorn said:

It's important to understand that Amazon is not taking a moral stand. They are closing down Parler because they could become legally liable. This is about protecting their own hide. The laws in the US require that platforms make a good faith effort towards moderation or be held liable for user content, and a key part of Parler's image was that they did not moderate and gave voices to people moderated off other platforms. They aren't making that 'good faith effort.'

You can find terrible things on Facebook, YouTube, Reddit, and yes, SL, but what all of those platforms have in common is a terms of service that they make a 'good faith' effort to uphold. If you report something, it might be removed. Parler wasn't removed from AWS until they refused to implement a moderation plan to remain on the Apple app store.

The actual effectiveness of any of those platforms' moderation is sort of beside the point. What matters from a liability standpoint is that the effort is made.

Good point! And with Section 230 under attack, social media services have to be very careful about what is represented on their platforms.


10 minutes ago, Blush Bravin said:

Good point! And with Section 230 under attack, social media services have to be very careful about what is represented on their platforms.

I had never heard of Section 230 and just read a bit about it. What I find ironic is that the timeline of its passage lines up with Amazon's beginnings. I imagine all those "used books" going through Amazon were not coloring books, to say the least - as we Americans say, "if you get my drift." Everything that goes through SL, with the billion and one items it could generate in a short time, is impossible to "police". I have a life of my own and I cannot be SL's unpaid police, nor do I want to get involved because, as I said, I have a life of my own to lead.

Edited by JanuarySwan

1 minute ago, JanuarySwan said:

I had never heard of Section 230 and just read a bit about it. What I find ironic is that the timeline of its passage lines up with Amazon's beginnings. I imagine all those "used books" going through Amazon were not coloring books, to say the least - as we Americans say, "if you get my drift." Everything that goes through SL, with the billion and one items it could generate in a short time, is impossible to "police". I have a life of my own and I cannot be SL's unpaid police, nor do I want to get involved because, as I said, I have a life of my own to lead.

I think the whole argument that somehow SL is threatened because of what's happening with Parler and Amazon is a bit of an overreaction. If Parler had removed the obvious calls for violence and the killing of government officials from their pages, this would not have happened. They've been catering to religious extremists and far-right militia groups since their beginning and are now suffering the consequences.


18 minutes ago, Blush Bravin said:

I think the whole argument that somehow SL is threatened because of what's happening with Parler and Amazon is a bit of an overreaction.

Yes, probably. Taking Amazon's content alone, there is no way to police all the adult and graphic books that may pass through its user content, let alone any other website. This is perhaps why the internet is often referred to as the new "Wild, Wild West" - and the internet was full of pioneers at that time as well.

Even with a cloud for all video games, I'd say it would still be difficult to find a real-life adult endangering a real-life child on, say, Fortnite.

As far as any recent changes to the internet go, I think governments wanted to clean up money laundering more than anything, as well as to make sure people are filing taxes on profits over a certain amount.

23 minutes ago, Blush Bravin said:

If Parler had removed the obvious calls for violence and the killing of government officials from their pages, this would not have happened. They've been catering to religious extremists and far-right militia groups since their beginning and are now suffering the consequences.

As far as this goes, yes, I agree.


33 minutes ago, Blush Bravin said:

I think the whole argument that somehow SL is threatened because of what's happening with Parler and Amazon is a bit of an overreaction. If Parler had removed the obvious calls for violence and the killing of government officials from their pages, this would not have happened. They've been catering to religious extremists and far-right militia groups since their beginning and are now suffering the consequences.

Absolutely. And Amazon appears to have told them to clean up their act prior to making the decision to remove them. Based on my experience with AWS through work clients, I'm going to guess it wasn't the first warning either.

I don't see this as a threat to Second Life. If you report things in SL there's usually a response. We could argue all day about it being not enough or not to our liking, but that's sort of beside the point.

The adult content is not a concern either, by the way. AWS hosts plenty of porn.


41 minutes ago, Bitterthorn said:

Absolutely. And Amazon appears to have told them to clean up their act prior to making the decision to remove them. Based on my experience with AWS through work clients, I'm going to guess it wasn't the first warning either.

I don't see this as a threat to Second Life. If you report things in SL there's usually a response. We could argue all day about it being not enough or not to our liking, but that's sort of beside the point.

The adult content is not a concern either, by the way. AWS hosts plenty of porn.

Depends on the kind of porn. Virtual child porn is a crime in almost every other country on the planet. No one there cares that the ones who drive the avatars are adults; what matters is the criminal act and the images themselves. Last year a Brit was sentenced to a few years in prison for spreading such smut (not even for creating it). And there are many, many more prosecutions and charges all over the world.

And even if the servers are located in the US, as soon as Section 230 falls, the hosting companies can be held liable by interested groups located outside the US. If big tech wants to save its safe harbor, it needs to take responsibility and act according to common sense before something much worse than solid self-policing gets it and the entire internet into existential trouble.

So no, adult content is a concern, and AWS could stop hosting the excesses of it - which might deprive them of part of their core business - any time now. Time is running out.


56 minutes ago, Vivienne Schell said:

Virtual child porn is a crime in almost every other country on the planet. No one there cares that the ones who drive the avatars are adults; what matters is the criminal act

What are you saying?  Virtual child pornography is legal in America?  Where do you get this evidence?  I've never heard of that!


The Linden Lab TOS directly addresses misconduct with minors. It also outlines provisions for dealing with those who do not follow the guidelines. When LL receives an abuse report of such conduct the incident is investigated and dealt with according to the TOS.

Again, LL does not operate as Parler did/does, where anything goes without restraint. Apples and oranges, folks! In the great words of the Bard, "Much ado about nothing."


4 hours ago, Bitterthorn said:

It's important to understand that Amazon is not taking a moral stand. They are closing down Parler because they could become legally liable. This is about protecting their own hide. The laws in the US require that platforms make a good faith effort towards moderation or be held liable for user content, and a key part of Parler's image was that they did not moderate and gave voices to people moderated off other platforms. They aren't making that 'good faith effort.'

You can find terrible things on Facebook, YouTube, Reddit, and yes, SL, but what all of those platforms have in common is a terms of service that they make a 'good faith' effort to uphold. If you report something, it might be removed. Parler wasn't removed from AWS until they refused to implement a moderation plan to remain on the Apple app store.

The actual effectiveness of any of those platforms' moderation is sort of beside the point. What matters from a liability standpoint is that the effort is made.

Just the opposite. They're not legally liable at all, not as long as they have Section 230. There are a number of exceptions for specific laws, and getting a warrant would have an effect, but overall, Amazon wouldn't be on the hook legally for what Parler users do.

3 hours ago, Blush Bravin said:

Good point! And with Section 230 under attack, social media services have to be very careful about what is represented on their platforms.

 

3 hours ago, JanuarySwan said:

I had never heard of Section 230 and just read a bit about it. What I find ironic is that the timeline of its passage lines up with Amazon's beginnings. I imagine all those "used books" going through Amazon were not coloring books, to say the least - as we Americans say, "if you get my drift." Everything that goes through SL, with the billion and one items it could generate in a short time, is impossible to "police". I have a life of my own and I cannot be SL's unpaid police, nor do I want to get involved because, as I said, I have a life of my own to lead.

 

3 hours ago, Blush Bravin said:

I think the whole argument that somehow SL is threatened because of what's happening with Parler and Amazon is a bit of an overreaction. If Parler had removed the obvious calls for violence and the killing of government officials from their pages, this would not have happened. They've been catering to religious extremists and far-right militia groups since their beginning and are now suffering the consequences.

The big "problem" with Section 230 is that it enforces the idea that these platforms aren't responsible for their users' content. So when they then turn around and start to allow only content they approve of, they're taking responsibility for the content - should they still be protected by 230? That's why it's under attack. Or at least, I can understand that criticism. In part, I believe that's why LL allows so much to go unchecked in SL: so they can benefit from 230's protections.

Typing that you're going to do something illegal should have consequences, but only for you, not the platform you do it on. That's why 230 is needed. Companies that ban stuff they don't like while benefitting from 230's protections, though - that's a fun debate.


1 hour ago, Paul Hexem said:

Just the opposite. They're not legally liable at all, not as long as they have Section 230. There are a number of exceptions for specific laws, and getting a warrant would have an effect, but overall, Amazon wouldn't be on the hook legally for what Parler users do.

Section 230 still requires providers to remove material that is illegal on a federal level (terrorism, CP, pirated movies, whatever). They lose the protections the moment they stop good faith moderation. It's not a 'you can't touch me' legal shield.

Did Parler have material that was federally illegal? Can you call what they refused to moderate sedition or terroristic threats? That's likely a question for actual lawyers, to be argued before the Supreme Court. Section 230 is... problematic in a lot of ways, not just the ones you mention. It is the reason we see YouTube take such a heavy-handed response to claims of copyright infringement. If the data on Parler were ruled to be federally illegal, then Amazon could be liable. That's not to say they would get named in the case; it's pretty untested legal territory for a cloud host.

And I'm gonna guess Amazon doesn't want to take that chance. Not when the political balance of power is shifting. 

What does that mean for 230 going forward? Oh who knows. 

 


2 hours ago, JanuarySwan said:

What are you saying?  Virtual child pornography is legal in America?  Where do you get this evidence?  I've never heard of that!

April 2002:

 

Supreme Court strikes down portions of virtual child porn law

  • Justices determined in a mixed opinion that the Child Pornography Prevention Act, designed to ban simulated sex of minors, stretched too far and could affect protected images in movies and on the Internet.

The Supreme Court on April 16 struck down portions of a federal child pornography law, ruling that the First Amendment protects the graphic manipulation of images even if it makes it appear that children are engaging in sex.

The Child Pornography Prevention Act of 1996 expanded current federal laws barring child pornography to include not only images of actual children engaging in explicit sexual conduct but also “any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture.”

But in its decision, the High Court determined that the law went too far. The decision struck down two provisions of the law outlawing visual materials that “appear to be a minor” or “conveys the impression” that a minor was involved. Parts of the law banning images using real minors had not been challenged.

“The First Amendment requires a more precise restriction,” wrote Justice Anthony Kennedy for a 6-3 majority that agreed to strike down both of the challenged provisions as overbroad. Justices John Paul Stevens, David Souter, Ruth Bader Ginsburg and Stephen Breyer joined Kennedy’s opinion. Justice Clarence Thomas wrote a concurring opinion.

Justice Sandra Day O’Connor agreed that the government could not outlaw making young-looking adults appear as children, giving that part of the opinion a 7-2 majority. But she sided with dissenters Chief Justice William Rehnquist and Justice Antonin Scalia in support of upholding the provision banning computer-generated images.

“The aim of ensuring the enforceability of our nation’s child pornography laws is a compelling one,” Rehnquist said. The law “is targeted to this aim by extending the definition of child pornography to reach computer-generated images that are virtually indistinguishable from real children engaged in sexually explicit conduct.”

The Free Speech Coalition, a trade association for the adult entertainment industry, challenged the law. The groups argued that the prohibition could encircle legitimate adult expression, making it a crime to depict a sex scene like those in the recent movies “Traffic” or “Lolita.” The Reporters Committee for Freedom of the Press joined a friend-of-the-court brief filed by the American Civil Liberties Union arguing that the law was unconstitutional.

 

(Ashcroft v. Free Speech Coalition) PT

I am not aware of a later decision, but maybe someone has a follow-up decision documented. As far as I know, the 2002 decision is still valid.

 

Edited by Vivienne Schell

6 hours ago, Bitterthorn said:

Section 230 still requires providers to remove material illegal on a federal level. (Terrorism, CP, pirated movies, whatever) They lose the protections the moment they stop good faith moderation. It's not a 'you can't touch me' legal shield.

Yet, somehow, known political figures still have Twitter accounts after calling for the genocide of an entire race or for acts of terrorism. Twitter's response to foreign political figures is that it is 'sabre rattling', yet Trump gets banned.

Do they lose protection, though, if good faith moderation takes place only for selected people?

Section 230 provides immunity to any platform, company, etc. that hosts content created by its users, and this immunity covers threats, terrorism, sexual content, false information, defamation - pretty much everything and anything. It is literally a 'you can't touch me' legal shield.

If Parler was used to organise the riot, then pray tell, how can Facebook be held by a court as immune to its users' use of the platform in organising terrorist acts, yet Parler is removed by big tech on the claim that moderation is needed on its platform, when Section 230 would apply even to Parler?

Amazon's Terms of Use are why Parler was removed from the platform, as they specifically state that you cannot use their servers for illegal activity. That said, if Amazon's terms of use didn't have such a clause, then even if Parler showed how to make a nuclear bomb, posted by one of its users, both Parler and Amazon would be immune under Section 230.

The same would apply to LL. Amazon and LL are immune to prosecution under Section 230 for content created and posted by their users; however, if Parler is removed for going against Amazon's Terms, then LL with Second Life could be equally removed based on content and things that go on in Second Life.

Edited by Drayke Newall
found the clause that Bitterthorn was talking about so edited it as such.

1 hour ago, Drayke Newall said:

if Parler is removed for going against Amazon's Terms, then LL with Second Life could be equally removed based on content and things that go on in Second Life.

Second Life could be removed for whatever, like for "being nasty", where "nasty" is whatever AWS says "nasty" is, with the definition written into their ToS and contracts. No one can force the Washington Post into publishing a Trump speech, for example, and that's not what Section 230 regulates/deregulates. Section 230 only makes it impossible to sue a social media company/network/corporation in a civil court for whatever, even if a crime committed by a user is proven - it's about liability and money in the end, and more or less only about that.


1 hour ago, Vivienne Schell said:

Second Life could be removed for whatever, like for "being nasty", where "nasty" is whatever AWS says "nasty" is, with the definition written into their ToS and contracts. No one can force the Washington Post into publishing a Trump speech, for example, and that's not what Section 230 regulates/deregulates. Section 230 only makes it impossible to sue a social media company/network/corporation in a civil court for whatever, even if a crime committed by a user is proven - it's about liability and money in the end, and more or less only about that.

That is/was my point. It was the Terms of Use that made Parler go, which is also related to the thread title. Was uplift a bad idea - especially now, knowing with precedent that Amazon can terminate SL if they find it goes against their terms (which in theory it does)? Section 230 protects both LL and Amazon from the content they host for their users; it doesn't protect their reputation. Hence why Parler was axed.


7 hours ago, Vivienne Schell said:
 

(Ashcroft v. Free Speech Coalition) PT

I am not aware of a later decision, but maybe someone has a follow-up decision documented. As far as i know the 2002 decision is still valid.

this Supreme Court ruling says that simulations/depictions of child pornography not involving actual children are protected by the 1st amendment, provided that the simulation/depiction is not legally obscene. It's the legally obscene bit that gets skipped over sometimes

in the USA there is no federal standard for legally obscene, and the US Supreme Court has in quite a number of cases ruled that the definition of what is legally obscene is a matter for the lower federal and state courts to determine case by case, taking into account contemporary societal morality as practiced within the jurisdiction of the court hearing the case, until such time as there is a federal legal standard

on this basis most US public-facing companies/providers go with if-it-looks-like-a-child-then-it's-obscene. A person so banhammered could challenge the company's understanding of contemporary societal morality as it relates to them simulating child pornography on the company's platform, but the further obstacle for such a person is the company's Terms of Service when it says simulating child pornography with child-looking avatars is not permitted


14 hours ago, Vivienne Schell said:

“appear to be a minor” or “conveys the impression” 

This is peculiar. "Appear to be a minor" or "conveys the impression" is extremely odd, but it makes me think of "Lolita" in particular, and I mean just the movie here. In the movie "Lolita" there are no actual sex acts between the two main characters, only implied or conveyed. But I'm going from memory here regarding the movie.

So, I don't understand this "law", and I don't think it relates to non-implied forms of illegal child pornography, whether virtual or not.

There is a Wiki on being a child avatar in SL. One is allowed to be a child avatar playing with toys and doing other childhood things, and/or to be a tiny, as tinies are not considered child avatars. Tinies are small animal avatars in SL that are like humans in that they walk, wear clothing, etc.

I also read in another Wiki from Linden Lab that if you don't know what you are or aren't allowed to do as a child avatar it is better not to be one at all.  

http://wiki.secondlife.com/wiki/Child_Avatar

 

Edited by JanuarySwan

55 minutes ago, Poxik Frostbite said:

I've been in Second Life for over 11 years. It was a good run. The time has come to build a better virtual space from scratch.

Many people have tried, without much success. There's been much activity in the last few years.

There was the era of the "game level loaders": creators build their own "game level", upload it, and others can go visit. Sansar, High Fidelity, and SineSpace are all in that category. It failed abysmally. None of those were able to average more than about 20-50 concurrent users. (SL runs about 30,000 to 50,000 average concurrent users. For reference, GTA V Online is slightly above that.) So that was a dead end.

Then there were the VR-first worlds. Facebook tried hard at this. Facebook Rooms (failed), Facebook Spaces (failed), and Facebook Horizon (live, but not taking off.) VRChat usage seems to be flat. VRChat has a good implementation, so the technology doesn't seem to be the obstacle. Lesson: throwing money at this does not guarantee success.

There are the cryptocurrency worlds: Decentraland, Somnium Space, and Upland. Mostly, people trade land hoping to get rich, and don't go in-world much. Upland didn't bother to build a virtual world at all. Upland uses Tilia for their connection to real-world money.

There are the voxel worlds, Roblox and Dual Universe. These are doing OK. They're not as good looking as SL, especially on the avatar side. Roblox, after over a decade, finally took off. Lesson: it takes a long time to reach critical mass in a system where the users make the content.

Dual Universe is a multi-planet system with both space travel and ground activity. You pay by the month, but there are no land charges. Imagine SL where, to get prims, you had to go to big open spaces and dig for them. They started with empty planets and waited to see what happened.

Users were supposed to seek, mine, and trade. But they made it too much like real life: users could build up huge mechanized mining complexes and extract resources in large quantities. So the developers changed the rules to prevent that, adding some totally arbitrary restrictions. This infuriated the users who'd been building. SL's Luca Grabacr has been active there and has videos on YouTube about that issue.

There were the SpatialOS worlds. SpatialOS, from Improbable, is a system built at great cost to support very large seamless worlds. It looked like the technical solution to many of SL's problems. But it's very expensive to run, and the first three big games built on it have already shut down. One game from China, Nostos, is still up. The artwork is great, but gamers report they can clear the game in three hours and then are done. The trouble with really big, professionally created worlds is that you need a huge team to create them and budgets the size of a major action film.

A virtual world needs a large number of good creators and active residents to work. It's very, very difficult to get that from a cold start. Even throwing a half billion dollars at the problem, as Improbable and Facebook did, does not guarantee success. Second Life's main asset is that it has that.

The world has to progress, or people get bored and leave.

 

 

Edited by animats

  • 9 months later...

I'm just going to say this and then I'm gonna disappear for another year or two:

We just went through something that's never happened before in human history during a time when we had this kind of technology, and Second Life blew one of the biggest opportunities they've ever had (during the pandemic). Instead, companies like Zoom took over. Second Life had the market cornered and they still screwed it up. Now Facebook has their own virtual world coming out, which we all know will be done better, because Linden seems to hate its customers even more than Facebook does.

Back when I ran my consulting corporation I tried my best to help Linden not fail, but I couldn't even get these people to return my phone calls. This company is just way too pigheaded; they think they know everything - as bad as Roberts with Star Citizen.

Second Life has wasted opportunity after opportunity; at the end of the day, when the company fails, they will have only themselves to blame. They have what could be an incredible platform, they have a great name, they have all the foundations, a virtual currency on an actual stock market, and they keep royally screwing things up. It's like they want to fail.

And no, I couldn't be bothered to read the rest of this thread, because it sounded like a bunch of lunatic, one-sided political extremists; they've already ruined everywhere on the Internet, including Twitter, Reddit, the news media, etc. No one has time for the children - whether they're actually child-aged or not, the entire party is nothing but a bunch of ignorant, idealist children.

Edited by Etalia Cristole

6 hours ago, Etalia Cristole said:

I'm just going to say this and then I'm gonna disappear for another year or two:

We just went through something that's never happened before in human history during a time when we had this kind of technology, and Second Life blew one of the biggest opportunities they've ever had (during the pandemic). Instead, companies like Zoom took over. Second Life had the market cornered and they still screwed it up. Now Facebook has their own virtual world coming out, which we all know will be done better, because Linden seems to hate its customers even more than Facebook does.

Back when I ran my consulting corporation I tried my best to help Linden not fail, but I couldn't even get these people to return my phone calls. This company is just way too pigheaded; they think they know everything - as bad as Roberts with Star Citizen.

Second Life has wasted opportunity after opportunity; at the end of the day, when the company fails, they will have only themselves to blame. They have what could be an incredible platform, they have a great name, they have all the foundations, a virtual currency on an actual stock market, and they keep royally screwing things up. It's like they want to fail.

And no, I couldn't be bothered to read the rest of this thread, because it sounded like a bunch of lunatic, one-sided political extremists; they've already ruined everywhere on the Internet, including Twitter, Reddit, the news media, etc. No one has time for the children - whether they're actually child-aged or not, the entire party is nothing but a bunch of ignorant, idealist children.

ok

 

6 hours ago, Etalia Cristole said:

Back when I ran my consulting corporation I tried my best to help Linden not fail...

Just one question in regard to that part: did you get paid to do that? (Because if you didn't... hmm.)


This topic is now closed to further replies.