
Tomos Halsey
Resident
  • Posts: 40
  • Joined
  • Last visited

Everything posted by Tomos Halsey

  1. The wiki shows things are happening, this is exciting!
https://wiki.secondlife.com/wiki/INVENTORY_MATERIAL
https://wiki.secondlife.com/wiki/CHANGED_RENDER_MATERIAL
https://wiki.secondlife.com/wiki/PRIM_RENDER_MATERIAL
https://wiki.secondlife.com/wiki/LlSetRenderMaterial
https://wiki.secondlife.com/wiki/LlSetLinkRenderMaterial
https://wiki.secondlife.com/wiki/LlGetRenderMaterial
  2. Sorry for the necro bump, but I too was looking for this information. Since the wiki is old and out of date and nobody seems to know, I figured the only way to find out was to dig through the source code of the viewer directly and paste the results.

TL;DR: Any content mimetype that gets handled by the "web" widget type in the table below gets piped through to the CEF or WebKit plugins; those plugins dictate the limitations of what the viewer can render through Shared Media, so you could probably load up https://caniuse.com/ on a prim in-world and check.

LONG version: To start, let's head on over to the repo at https://bitbucket.org/lindenlab/viewer/src/master/ and this is where we will be looking for clues. Glancing into the 'viewer/indra/media_plugins' folder, there is a base plugin other things can inherit from, the CEF (Chromium Embedded Framework) plugin for prim media, and gstreamer010 / libvlc plugins for handling Parcel Media per platform. These are the available plugins the viewer can use for handling media requests, via mimetype mapping through 'viewer/indra/newview/llmimetypes.cpp'. If we take a look at 'viewer/indra/newview/llmimetypes.h', a comment indicates that an XML file with the lookup information for each mime type the viewer can recognize is loaded from somewhere in the viewer's 'skins' folder, so off to the skins folder on our local machine! Searching inside the viewer's skins folder, you can see that there are many different 'mime_types' XML files scattered throughout: one per skin (if other skins are present), per language, and even per platform (e.g. mime_types_mac.xml, mime_types_linux.xml)... this is going to get messy, so I will stick with English on Windows. This is the list inside 'skins/default/xui/en/mime_types.xml' on the official Linden Lab SL viewer 6.6.4.575022 (64-bit) on a Windows machine.
Mimetype                        Plugin               Widget Type
blank                           media_plugin_cef     none
none/none                       media_plugin_cef     none
audio/*                         media_plugin_libvlc  audio
video/*                         media_plugin_libvlc  movie
image/*                         media_plugin_cef     image
video/vnd.secondlife.qt.legacy  media_plugin_libvlc  movie
application/javascript          media_plugin_cef     web
application/ogg                 media_plugin_cef     audio
application/pdf                 media_plugin_cef     image
application/postscript          media_plugin_cef     image
application/rtf                 media_plugin_cef     image
application/smil                media_plugin_cef     movie
application/xhtml+xml           media_plugin_cef     web
application/x-director          media_plugin_cef     image
audio/mid                       media_plugin_cef     audio
audio/mpeg                      media_plugin_libvlc  audio
audio/x-aiff                    media_plugin_libvlc  audio
audio/x-wav                     media_plugin_libvlc  audio
image/bmp                       media_plugin_cef     image
image/gif                       media_plugin_cef     image
image/jpeg                      media_plugin_cef     image
image/png                       media_plugin_cef     image
image/svg+xml                   media_plugin_cef     image
image/tiff                      media_plugin_cef     image
text/html                       media_plugin_cef     web
text/plain                      media_plugin_cef     text
text/xml                        media_plugin_cef     text
video/mpeg                      media_plugin_libvlc  movie
video/mp4                       media_plugin_libvlc  movie
application/octet-stream        media_plugin_libvlc  movie
video/quicktime                 media_plugin_libvlc  movie
video/x-ms-asf                  media_plugin_libvlc  movie
video/x-ms-wmv                  media_plugin_cef     movie
video/x-msvideo                 media_plugin_libvlc  movie

I'd imagine that if someone added a new mapping to this list, a new mimetype could be piped through the plugins as well (assuming they can handle the data stream). Oh, I almost forgot to mention: Shared Media is the "web" widget type, so anything under there should be playable by all Linden viewers.
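As a rough illustration of what this lookup amounts to, here is a hypothetical Python sketch that parses a mime_types.xml-style file into a mimetype-to-plugin map. The element and attribute names used here (mimetype, impl, widgettype) are assumptions modeled on the table above, not a guaranteed match for the viewer's actual schema:

```python
# Hypothetical sketch: pull mimetype -> (plugin, widget) mappings out of a
# mime_types.xml-style file. Element/attribute names are assumptions based
# on the table above, not the viewer's actual schema.
import xml.etree.ElementTree as ET

SAMPLE = """
<mimetypes>
  <mimetype name="text/html"><impl>media_plugin_cef</impl><widgettype>web</widgettype></mimetype>
  <mimetype name="video/mp4"><impl>media_plugin_libvlc</impl><widgettype>movie</widgettype></mimetype>
</mimetypes>
"""

def load_mime_map(xml_text):
    """Return {mimetype: (plugin, widget)} parsed from the sample schema above."""
    root = ET.fromstring(xml_text)
    mapping = {}
    for node in root.findall("mimetype"):
        mapping[node.get("name")] = (node.findtext("impl"), node.findtext("widgettype"))
    return mapping

# Everything tagged "web" would be Shared Media territory
shared_media = [m for m, (_, w) in load_mime_map(SAMPLE).items() if w == "web"]
```

With a dump of the real mime_types.xml in hand, the same loop would tell you at a glance which types land in CEF versus libvlc.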
  3. Thanks for the kind words. I wanted to leave the interpolation in so that people could add their own final polish; this is for other scripters to use as a base. I am also using it for tree roots and a dynamic pathway in a treehouse, so lightning isn't the only thing it has to look like. My product version hides the segments and then shows them after they are constructed. Since the UV space on my bolts is really compact, I scroll a texture and that causes the flashing, but in this version it's a technical concept to expand on.
  4. I just finished this little project for my localized weather system and thought others could use it too. I wanted a 3D lightning that sprouts toward an object and noticed there wasn't any I could locate. I tried to place it on the wiki but it seems that I can't edit anything anywhere, so not sure what that's all about. So I figured the forums would be a nice place to put it.

Instructions: Create a ton of blocks and name them "segment". Link them to a root prim of some sort and name it anything. Drop this script into the root prim and away we go!

// 2017 written by To-mos Codewarrior (tomos.halsey)
// Procedural 3D lightning based on XNA tutorial
// https://gamedevelopment.tutsplus.com/tutorials/how-to-generate-shockingly-good-2d-lightning-effects--gamedev-2681
integer LINK_currentSeg;
list LINK_segments;

//increment segment link number
integer incLink() {
    integer returnLink = llList2Integer(LINK_segments, LINK_currentSeg);
    LINK_currentSeg++;
    if( LINK_currentSeg > llGetListLength(LINK_segments) ) {
        //llOwnerSay("reached end of segments: "+(string)LINK_currentSeg);
        LINK_currentSeg = llGetListLength(LINK_segments)-1;
        return -2;
    }
    return returnLink;
}

//generate and return prim parameters for each lightning segment
list placeSegment( vector v1, vector v2, float thickness ) {
    //generate rotations between the vectors
    vector offset = v1 - v2;
    rotation rot = llEuler2Rot(<0.0,llAtan2(-offset.z,offset.x),0.0>);
    //localize offset to perspective of y rot
    offset /= rot;
    return [
        PRIM_POS_LOCAL, ( v1 + v2 ) * 0.5,
        //solve for z rotation locally to y
        PRIM_ROT_LOCAL, llEuler2Rot(<0.0,0.0,llAtan2(offset.y,offset.x)>)*rot,
        PRIM_SIZE, <llVecDist(v1, v2), thickness, thickness>
    ];
}

//dissect the parameters list into sections based on metadata and
//return the segment position based on percentage of whole bolt index range
vector getBoltPos( list primParams, integer boltNumber, float segmentPercent ) {
    //first index is metadata for bolts
    if( llGetListEntryType( primParams, 0 ) != TYPE_STRING ) {
        llOwnerSay("getBoltPos(): Supplied parameters have no metadata.");
        return ZERO_VECTOR;
    }
    list boltMeta = llParseString2List( llList2String( primParams, 0 ), [","], [] );
    if( boltNumber > llGetListLength( boltMeta ) - 1 ) {
        llOwnerSay("getBoltPos(): Bolt index \"" + (string)boltNumber + "\" is out of range");
        return ZERO_VECTOR;
    }
    string boltIndexRange = llList2String( boltMeta, boltNumber );
    if( boltIndexRange == "" ) {
        llOwnerSay("getBoltPos(): No metadata for bolt: \"" + boltIndexRange + "\"");
        return ZERO_VECTOR;
    }
    //find delimiter and use it
    integer splitIndex = llSubStringIndex( boltIndexRange, ":" );
    //For those wondering, if the return value is -1, then ~-1 is 0 because -1 is a string of all 1 bits.
    //Any value greater than or equal to zero will give a non-zero result.
    if( ~splitIndex ) {
        //start index
        integer startIndex = (integer)llGetSubString( boltIndexRange, 0, splitIndex-1 );
        //end index
        integer endIndex = (integer)llGetSubString( boltIndexRange, splitIndex+1, -1 );
        //get the range of the indices
        integer indexRange = endIndex - startIndex;
        //grab an index based off a percentage, llFloor style
        integer segmentIndex = (integer)( ( indexRange / 8.0 ) * segmentPercent );
        //grab the segment chain representing a single bolt
        //let's reuse the variable to prevent extra allocation
        boltMeta = llList2List( primParams, startIndex, endIndex );
        //return our located position
        if( 8*segmentIndex+3 > llGetListLength( boltMeta )-1 ) {
            llOwnerSay("getBoltPos(): Segment index \"" + (string)(8*segmentIndex+3) + "\" of bolt \"" + (string)boltNumber + "\" is out of range");
        }else {
            return llList2Vector( boltMeta, 8*segmentIndex+3);
        }
    }else {
        llOwnerSay("getBoltPos(): Malformed metadata: \"" + boltIndexRange + "\"");
    }
    return ZERO_VECTOR;
}

//meat and potatoes of this script. This takes a start and end local position
//with a thickness and generates prim link params representing a single lightning bolt
list createBolt( vector source, vector dest, float thickness ) {
    list results = [ 0.0 ];
    vector tangent = dest - source;
    vector normal = llVecNorm( tangent );
    float boltMag = llVecMag( tangent );
    float Sway = 5.0;//meters
    float Jaggedness = 1.0 / Sway;
    integer i = 0;
    integer points = llRound( boltMag / 3.0 );
    if( points < 1.0 ) {
        //llOwnerSay("Bolt not long enough: "+(string)points);
        return [];
    }
    //random points between 0.0 (0%) to 1.0 (100%)
    for( ;i < points; i++ ) results += [ llFrand( 1.0 ) ];
    //now sort the points so we have randomness from A to B
    results = llListSort( results, 1, TRUE );
    //store length before adding on optional random points
    integer length = llGetListLength( results );
    vector prevPoint = source;
    vector prevDisplacement = ZERO_VECTOR;
    integer linkNum;
    for ( i = 1; i < length; i++ ) {
        //since we are set inside a range from the for loop
        //the first few indices will work
        float pos = llList2Float( results, i );
        //used to prevent sharp angles by ensuring very close
        //positions also have small perpendicular variation.
        float scale = ( boltMag * Jaggedness ) * ( pos - llList2Float( results, i - 1 ) );
        //great way to make circular bullet spray
        //to all those weapon makers out there :)
        float sprayAng = llFrand( TWO_PI );
        float sprayScale = Sway + llFrand( -Sway*2.0 );
        vector displacement = <llCos(sprayAng)*sprayScale, llSin(sprayAng)*sprayScale, 0.0>;
        displacement.x -= (displacement.x - prevDisplacement.x) * (1.0 - scale);
        displacement.y -= (displacement.y - prevDisplacement.y) * (1.0 - scale);
        //defines an envelope. Points near the middle of the bolt can be further from the central line.
        if( pos > 0.95 ) displacement *= 10.0 * (1.0 - pos);
        //project a point from A to B then
        //align the circular spray to the normal with a cross product
        vector point = source + pos * tangent + displacement % normal;
        //grab link and then increment it
        linkNum = incLink();
        if( linkNum != -2 ) {
            results += [ PRIM_LINK_TARGET, linkNum ]+placeSegment(prevPoint, point, thickness);
        }
        prevPoint = point;
        prevDisplacement = displacement;
    }
    //clean up the list by removing the
    //prefixed random points we no longer need
    results = llDeleteSubList( results, 0, length-1 );
    //grab link and then increment it
    linkNum = incLink();
    //concatenate last bolt params and return it
    if( linkNum != -2 ) {
        return results + [PRIM_LINK_TARGET, linkNum] + placeSegment(prevPoint, dest, thickness);
    }else {
        return results;
    }
}

//generate one single bolt and then tack on a random amount
//based on a start and end pos and bolt thickness
list generateLightning( vector startingPoint, vector endingPoint, float thickness ) {
    LINK_currentSeg = 0;
    list boltParams = createBolt( startingPoint, endingPoint, thickness );
    //tack on some metadata; index may start at 0 but since metadata
    //is there don't -1 from get list length
    string metaData = "1:"+(string)(llGetListLength(boltParams));
    boltParams = [metaData] + boltParams;
    list randNum = [];
    //3 to 10 random bolts
    integer numOfRandBolts = llRound( 10.0 - llFrand( 7.0 ) );
    integer i = 0;
    //random points between 0% to 100%
    for( ;i < numOfRandBolts; i++ ) randNum += [ llFrand( 1.0 ) ];
    //now sort the points so we have randomness from A to B
    randNum = llListSort( randNum, 1, TRUE );
    //build perspective to target we are striking
    //so the 30 degrees is localized to that target
    vector tangent = endingPoint - startingPoint;
    //generate rotations between the vectors
    rotation rotToAvi = llEuler2Rot(<0.0,llAtan2(tangent.x,tangent.z),0.0>);
    //localize offset to perspective of y rot
    tangent /= rotToAvi;
    //build us the final angle
    rotToAvi = llEuler2Rot(<PI+llAtan2(-tangent.y,tangent.z),0.0,0.0>)*rotToAvi;
    //generate each child bolt
    for ( i=0; i < numOfRandBolts; i++ ) {
        //read the bolt metadata and get a random segment pos
        vector spawnPos = getBoltPos( boltParams, 0, llList2Float( randNum, i ) );
        //use the scale of the overall bolt to
        //parametrize the child lengths
        //(this is a bit performance heavy so adjust accordingly)
        float distance = llVecMag( endingPoint - spawnPos );
        //rotate 30 degrees (in radians). Alternate between rotating left and right.
        //generate child lightning from 30% to 70% length
        vector posOut = spawnPos - <0.0, 0.0, distance> * (rotToAvi * llEuler2Rot( <llFrand(1.5708)-0.785398, llFrand(1.5708)-0.785398 ,0.0> ));
        //grab the length before for start index and diff to end
        integer beforeLen = llGetListLength(boltParams);
        //comma and start index of the bolt params
        metaData += ","+(string)beforeLen;
        //actually generate and store the params for child bolts
        boltParams += createBolt( spawnPos, posOut, thickness );
        //add colon to separate data later and index of end of bolt params
        metaData += ":"+(string)(beforeLen+(llGetListLength(boltParams) - beforeLen)-1);
        //now store the altered metadata back onto the list
        boltParams = [metaData] + llDeleteSubList( boltParams, 0, 0 );
    }
    //we are done with the metadata
    return llDeleteSubList( boltParams, 0, 0 );
}

default {
    state_entry() {
        integer length = llGetNumberOfPrims();
        for(; length > 1; length--) {
            string name = llGetLinkName( length );
            if( name == "segment" ) {
                LINK_segments += [ length ];
            }
            llSetLinkPrimitiveParamsFast(length,[
                PRIM_COLOR, ALL_SIDES, <1.0,1.0,1.0>, 1.0,
                PRIM_SIZE, <0.2,0.2,0.2>,
                PRIM_LINK_TARGET, LINK_THIS,
                PRIM_TEXT, (string)(llGetFreeMemory()*0.000977)+"kb", <1.0,1.0,1.0>, 1.0]);
        }
        vector aviPos = llList2Vector(llGetObjectDetails(llGetOwner(),[OBJECT_POS]),0);
        vector localOffset = (aviPos - llGetRootPosition())/llGetRootRotation();
        //list params = createBolt( ZERO_VECTOR, localOffset, 0.2 );
        //typical bolt structure
        //PRIM_LINK_TARGET, linkNumOfBoltSegment,
        //    PRIM_POS_LOCAL, <-9.312323, -9.211586, -38.696487>,
        //    PRIM_ROT_LOCAL, <0.299013, -0.810049, -0.174666, 0.473185>,
        //    PRIM_SIZE, <2.527075, 0.100000, 0.100000>
        list params = generateLightning( ZERO_VECTOR, localOffset, 0.1 );
        //print the params for debug
        /*length = llGetListLength(params)/8;
        integer i = 0;
        for( ; i<length; i++ ) {
            llOwnerSay("param: "+llList2CSV( llList2List( params, i*8, i*8+7 ) ) );
        }*/
        llSetLinkPrimitiveParamsFast(LINK_THIS, [PRIM_TEXT,(string)(llGetFreeMemory()*0.000977)+"KB",<1.0,1.0,1.0>,1.0]+params);
        llSetTimerEvent(4.0);
    }
    //make single bolt of lightning for testing
    touch_start(integer total_number) {
        LINK_currentSeg = 0;
        llSetLinkPrimitiveParamsFast(LINK_ALL_CHILDREN,[PRIM_POS_LOCAL,ZERO_VECTOR,PRIM_SIZE,<0.2,0.2,0.2>]);
        vector aviPos = llDetectedPos(0);
        vector localOffset = (aviPos - llGetRootPosition())/llGetRootRotation();
        list params = createBolt( ZERO_VECTOR, localOffset, 0.2);
        //list params = generateLightning( ZERO_VECTOR, localOffset, 0.1 );
        llSetLinkPrimitiveParamsFast(LINK_THIS, [PRIM_TEXT,(string)(llGetFreeMemory()*0.000977)+"KB",<1.0,1.0,1.0>,1.0]+params);
    }
    timer() {
        llSetLinkPrimitiveParamsFast(LINK_ALL_CHILDREN,[PRIM_POS_LOCAL,ZERO_VECTOR,PRIM_SIZE,<0.2,0.2,0.2>]);
        vector aviPos = llList2Vector(llGetObjectDetails(llGetOwner(),[OBJECT_POS]),0);
        vector localOffset = (aviPos - llGetRootPosition())/llGetRootRotation();
        list params = generateLightning( ZERO_VECTOR, localOffset, 0.1 );
        llSetLinkPrimitiveParamsFast(LINK_THIS, [PRIM_TEXT,(string)(llGetFreeMemory()*0.000977)+"KB",<1.0,1.0,1.0>,1.0]+params);
    }
}
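For anyone porting the idea out of LSL, the core of createBolt() boils down to a short algorithm: sorted random fractions along the segment, plus a damped perpendicular displacement per point. Here is a minimal 2D sketch in Python (illustration only; the function and parameter names are mine, not from the script):

```python
# 2D illustration of the "sorted random fractions + damped perpendicular
# displacement" idea used by createBolt() above. Names are hypothetical.
import math
import random

def create_bolt_2d(source, dest, sway=5.0, seg_len=3.0, rng=random):
    """Return a list of (x, y) points forming one jagged bolt from source to dest."""
    sx, sy = source
    dx, dy = dest
    tx, ty = dx - sx, dy - sy
    mag = math.hypot(tx, ty)
    # unit normal perpendicular to the segment direction
    nx, ny = -ty / mag, tx / mag
    jaggedness = 1.0 / sway
    count = max(1, round(mag / seg_len))
    fractions = sorted(rng.random() for _ in range(count))
    points = [source]
    prev_disp = 0.0
    prev_frac = 0.0
    for frac in fractions:
        # close-together fractions get proportionally small displacement changes,
        # which is what keeps the bolt from kinking at sharp angles
        scale = (mag * jaggedness) * (frac - prev_frac)
        disp = rng.uniform(-sway, sway)
        disp = prev_disp + (disp - prev_disp) * scale
        if frac > 0.95:          # taper the envelope near the endpoint
            disp *= 10.0 * (1.0 - frac)
        points.append((sx + frac * tx + disp * nx, sy + frac * ty + disp * ny))
        prev_disp, prev_frac = disp, frac
    points.append(dest)
    return points
```

The LSL version does the same thing in 3D, spraying the displacement around the bolt axis with a random angle instead of a signed scalar.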
  5. The Second Life viewers utilize the Blinn-Phong lighting model to handle their scene rendering. You are trying to use Substance's physically based rendering (PBR) materials in SL's old material pipeline. You will need to tell the program to generate the older maps for SL: specifically diffuse, specular, and normal maps. I only use Substance Painter for PBR texturing, but when I needed to export to SL I had to set up a custom export option to accomplish this task. Here is a nice screenshot to show how that was accomplished.
  6. Unfortunately, none of the DAE files in the examples show signs of "sharing the same polygons". If we take a look at the COLLADA documentation here, it dictates that geometry is defined inside <mesh> tags, which contain sets of <triangles> tags. These tags get assigned materials by defining a material attribute that corresponds with a <library_materials> tag defined further up in the file. With that said, if we look into the files provided we can see two sets of <triangles> inside the <mesh> tags. This means that there are literally TWO sets of identical mesh with two separate materials assigned. So what does this mean? It means that this isn't an SL "bug" but an exporter bug that writes a DAE file with double the polys, one set over top of the previously defined set. So the uploader isn't "glitching" or "bugging" at all, just reading a normal COLLADA file with two sets of geometry and two material faces. Here is our proof that the example files from the YouTube page are just normal COLLADA files.
<library_materials>
  <material id="ccc" name="ccc">
    <instance_effect url="#ccc-fx"/>
  </material>
  <material id="bbb" name="bbb">
    <instance_effect url="#bbb-fx"/>
  </material>
</library_materials>

And the corresponding triangles the materials are being assigned to... (I truncated a little to save space)

<library_geometries>
  <geometry id="wall_test-lib" name="wall_testMesh">
    <mesh>
      <source id="wall_test-POSITION">
        <float_array id="wall_test-POSITION-array" count="96">
          -146.407776 -118.058250 0.000000
          146.407776 -118.058250 0.000000
          -146.407776 118.058250 0.000000
          146.407776 118.058250 0.000000
          -86.165428 19.481628 0.000002
          -86.165428 -19.481628 -0.000002
          86.165428 -19.481628 -0.000002
          86.165428 19.481628 0.000002
          [...]
        </float_array>
        <technique_common>
          <accessor source="#wall_test-POSITION-array" count="32" stride="3">
            <param name="X" type="float"/>
            <param name="Y" type="float"/>
            <param name="Z" type="float"/>
          </accessor>
        </technique_common>
      </source>
      <source id="wall_test-Normal0">
        <float_array id="wall_test-Normal0-array" count="522">
          0.000000 0.000000 1.000000
          0.000000 0.000000 1.000000
          0.000000 0.000000 1.000000
          0.000000 0.000000 1.000000
          0.000000 0.000000 1.000000
          0.000000 0.000000 1.000000
          0.000000 0.000000 1.000000
          0.000000 0.000000 1.000000
          [...]
        </float_array>
        <technique_common>
          <accessor source="#wall_test-Normal0-array" count="174" stride="3">
            <param name="X" type="float"/>
            <param name="Y" type="float"/>
            <param name="Z" type="float"/>
          </accessor>
        </technique_common>
      </source>
      <source id="wall_test-UV0">
        <float_array id="wall_test-UV0-array" count="168">
          0.000000 0.000000
          0.638178 0.000000
          0.371744 0.162824
          0.266434 0.162824
          0.894690 0.495181
          0.638178 0.791425
          0.198716 0.919680
          0.093406 0.919680
          [...]
        </float_array>
        <technique_common>
          <accessor source="#wall_test-UV0-array" count="84" stride="2">
            <param name="S" type="float"/>
            <param name="T" type="float"/>
          </accessor>
        </technique_common>
      </source>
      <vertices id="wall_test-VERTEX">
        <input semantic="POSITION" source="#wall_test-POSITION"/>
      </vertices>
      <triangles count="58" material="ccc">
        <input semantic="VERTEX" offset="0" source="#wall_test-VERTEX"/>
        <input semantic="NORMAL" offset="1" source="#wall_test-Normal0"/>
        <input semantic="TEXCOORD" offset="2" set="0" source="#wall_test-UV0"/>
        <p>9 0 9 10 1 80 11 2 81 11 3 81 8 4 8 9 5 9 2 6 0 0 7 1 5 8 2 5 9 2 4 10 3 2 11 0 [...]</p>
      </triangles>
      <triangles count="58" material="bbb">
        <input semantic="VERTEX" offset="0" source="#wall_test-VERTEX"/>
        <input semantic="NORMAL" offset="1" source="#wall_test-Normal0"/>
        <input semantic="TEXCOORD" offset="2" set="0" source="#wall_test-UV0"/>
        <p>9 0 9 10 1 80 11 2 81 11 3 81 8 4 8 9 5 9 2 6 0 0 7 1 5 8 2 5 9 2 4 10 3 2 11 0 [...]</p>
      </triangles>
    </mesh>
  </geometry>
</library_geometries>

Yeah, so overall it looks like there is no Black Magic or Magical Bugs that allow for this. It's just a simple file with duplicated polys that have two separate materials on them.
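If you want to check your own exports for this kind of doubling, a rough sketch of a checker in Python follows. The SAMPLE_DAE string and function name are hypothetical, and real COLLADA files also carry an XML namespace, which this sketch ignores:

```python
# Hypothetical checker: flag <triangles> blocks inside a COLLADA <mesh> whose
# index streams (<p>) are identical, i.e. the "doubled geometry" described above.
# Real DAE files are namespaced; a production version would need ns-aware lookups.
import xml.etree.ElementTree as ET
from collections import Counter

SAMPLE_DAE = """
<mesh>
  <triangles count="2" material="ccc"><p>9 0 9 10 1 80 11 2 81</p></triangles>
  <triangles count="2" material="bbb"><p>9 0 9 10 1 80 11 2 81</p></triangles>
</mesh>
"""

def duplicated_triangle_sets(mesh_xml):
    """Return True if two <triangles> elements share the same <p> index data."""
    mesh = ET.fromstring(mesh_xml)
    streams = [tuple((tri.findtext("p") or "").split()) for tri in mesh.iter("triangles")]
    return any(n > 1 for n in Counter(streams).values())
```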
  7. Not to revive an old thread, but I was hoping to help some people. I have continued the project in my own fork on GitHub and have added code-block highlighting and most, if not all, of the current functions listed on the LSL wiki. I still need to enter the default inputs for some functions and events, but I will try to keep it up to date as new functions and whatnot come out. https://github.com/To-mos/sublime-lsl-syntax
  8. Yes, Intel Iris will run SL fine, and at a massive resolution too! If you plan on using SL on Windows through Boot Camp, you will need to modify the gpu_table.txt file for your viewer to recognize the Iris card; I wrote a guide on doing this here. I actually wrote this post from my Intel Iris Mac on the Boot Camp side, so that's how often I use the Iris Mac for SL stuff. I get about 80 FPS on Ultra on Firestorm 64-bit edition using an Intel Iris 5100, so it's great for development on the go. If you are using the Mac version, make sure to set your window resolution at login under the debug preferences.
  9. I forgot to mention, though: there is also a gpu_table.txt in the root of the viewer folder that I have to change sometimes when the viewer won't read the one in the "Roaming" folder. Try going to the folder the viewer is installed in and look there for gpu_table.txt.
  10. I'm terribly sorry, I do not use Windows 8.0 or 8.1, so I will have to find a copy and change it for you. Hope you have better luck with it.
  11. You will want to use a more advanced text editor that supports the encoding format of the txt file. I just used Visual Studio C++ and it looked fine, probably because they used it. I got them to fix it in the more recent versions of the viewer, but it looks like it's been set back. I left this up for reference because it's hard to get the freaking Firestorm team to listen.
  12. So I got a Mac and loaded Windows on it because my software selection doesn't have much Mac support. I would log in to the SL or Firestorm viewer to find that my Iris GPU wasn't being detected. After looking through the gpu_table.txt file and seeing that it had a listing for the Intel Iris card, it was strange that the card wasn't being picked up. I looked into the viewer's log and saw that it was looking for a card with "(TM)" in the name. I modified the regex in the gpu_table to ignore the "(TM)" in the graphics card name and voila, it works. So the file you want to look for is gpu_table.txt, either in the viewer's root folder or somewhere in your AppData/Roaming/Viewername location. Just open the file and replace the lines starting with "Intel Intel":

From this...

Intel Intel Iris Graphics 5100      .*Intel.*Iris Graphics 51.*      4 1 0 4
Intel Intel Iris Pro Graphics 5200  .*Intel.*Iris Pro Graphics 52.*  4 1 0 4

To this...

Intel Intel Iris Graphics 5100      .*Intel.*Iris.* Graphics 51.*      4 1 0 4
Intel Intel Iris Pro Graphics 5200  .*Intel.*Iris.* Pro Graphics 52.*  4 1 0 4
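To see why the tweak works, here is a quick Python check. The exact card string below is an assumption based on the log behavior described above; the two patterns are the ones from the table rows:

```python
# Demonstrates why the stock gpu_table.txt pattern misses an Iris card whose
# driver string contains "(TM)", and why adding ".*" after "Iris" fixes it.
# The card_name string is an assumption, not a verbatim driver string.
import re

card_name = "Intel(R) Iris(TM) Graphics 5100"

old_pattern = r".*Intel.*Iris Graphics 51.*"    # requires "Iris Graphics" to be adjacent
new_pattern = r".*Intel.*Iris.* Graphics 51.*"  # lets "(TM)" sit between them

old_match = re.search(old_pattern, card_name)   # no match: "(TM)" breaks adjacency
new_match = re.search(new_pattern, card_name)   # matches
```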
  13. It's a very handy effect; I've already used it for swords with trails and vehicle tracks. I'm looking for a way to use it for footprints and bullet decals, since the particle follows the orientation of the prim doing the rezzing. Here is a quick demo on getting it working like the picture provided earlier.

llParticleSystem([
    PSYS_PART_FLAGS, PSYS_PART_RIBBON_MASK|PSYS_PART_INTERP_SCALE_MASK|PSYS_PART_INTERP_COLOR_MASK|PSYS_PART_EMISSIVE_MASK,
    PSYS_SRC_PATTERN, PSYS_SRC_PATTERN_DROP,
    PSYS_PART_BLEND_FUNC_SOURCE, PSYS_PART_BF_SOURCE_ALPHA,
    PSYS_PART_BLEND_FUNC_DEST, PSYS_PART_BF_ONE_MINUS_SOURCE_ALPHA,
    PSYS_SRC_BURST_SPEED_MIN, 0.01,
    PSYS_SRC_BURST_SPEED_MAX, 0.01,
    PSYS_PART_MAX_AGE, 4.0,
    PSYS_SRC_BURST_PART_COUNT, 2,
    PSYS_SRC_ACCEL, <0.0,0.0,0.0>,
    PSYS_PART_START_GLOW, 1.0,
    PSYS_PART_END_GLOW, 0.0,
    PSYS_PART_START_SCALE, <0.5,0.5,0.0>,
    PSYS_PART_END_SCALE, <0.5,0.5,0.0>,
    PSYS_PART_START_COLOR, <1.0,0.0,0.0>,
    PSYS_PART_END_COLOR, <0.996,0.890,0.192>
]);
  14. OK, let me straighten this out... YES, it's a PNG image, and YES, it's an executable. They have embedded the exe inside the PNG, just like the age-old JPG exe embedding. What you do is stick the hex representation of the program in the PNG's header so that when the file is opened it executes the program to infect the machine. This is quite an old way of infecting people, but it is a way that members of 4chan like to pass executables across the image board. http://www.cyberengineeringservices.com/malware-obfuscated-within-png-files-sample-2-2/
  15. What you want to do is first select the plane, then select Rendering from the top, click Render to Texture and then click "add" in the Outputs window and select ShadowsMap. This should allow you to bake a transparent shadow map onto the plane. I use it for overlaying shadows on camera tracking CGI backdrops --11 years 3dsMax CGI/Game modeler
  16. Yeah, I'm with them: the hair can be dropped quite a LOT, actually. The game design teams I've worked with in the past tend to UV map some bent planes, move them around over the model, and use transparent hair textures. It allows for a low texture overhead on the rendering buffer, and it also looks nice with spec and normal materials. The display impact and land impact can be affected not only by vert count but by texture resolution. The land impact takes into consideration all the LODs and the display impact generated from the high resolutions. Taking this into consideration will allow you to drop your land impact drastically later on down the road for non-worn items.
  17. To answer your question: YES, you can. The latest copybot viewers support full mesh ripping, and they work. I'll break down in detail how they rip mesh. There are things known as render rippers, and what they do is tap into the render pipeline of the graphics library the software is using, i.e. DirectX or OpenGL. There is a DirectX ripper that does just this and can pull objects from any DirectX 9, 10, or 11 engine. As for the viewers, the programmers take an existing viewer and compile it with their own homebrewed ripping code that reads the contents of the render buffer and writes it out to an OBJ file. Now, dealing with the perspective of the rendering camera and the matrices required to skew the view onto a flat monitor screen can mess up the mesh object when it's ripped. When this happens, the ripped model file will be skewed and will require cleanup before the mesh is usable. So basically, you can rip mesh, but it requires a bit of cleanup before you can just go and re-upload it.
  18. Here is something I started on for 3ds Max users to export and import .anim files directly to and from SL so they have full control over bone priority and hopefully can animate attachments too if everything goes as planned.
  19. I'm so glad someone else is concerned about polycount on their models. The reason SL took so many resources is the inefficiency of prims and sculpts: each sculpt you use adds another 2048 polygons, and prims themselves take many to build some of the simplest shapes, so it's great that you are using mesh to its full potential.
  20. After a few years on SL, this spider is the only one of its kind I've ever seen here. There are some other non-physical spider mechs in SL (one actually inspired me to make this one), but none of them can jump, hang off rope, shoot, or be towed by aircraft. I have been inspired by a few people on here, one of them being Nexii Malthus (one of the other mech creators), and they have shown me the possibilities of SL. I wanted to better the world of SL, from viewers to models to even the freebie scripts and examples I've added to the wiki, to allow people to learn and raise the standard of SL. I had to fight to learn what I know now, because every script (crappy or good) was hidden behind a purchase or paywall. I decided to start writing systems that should cost thousands of lindens but give them out free, full perm. People deserve knowledge; it's in their nature, and there is no reason it should be hidden from anyone.
  21. Hope more people submit stuff; it's cool seeing poses like LepreKhaun's.
  22. Well, this project only uses 2 scripts: one is for the physics engine and the other is for the weapons and seating/doors etc. This was one of the greatest challenges of building the system in SL's environment. Due to SL's limitations, it was challenging watching my script time and execution rates for the physics engine. That being said, it was a great opportunity to understand the programming environment that you are given and how it works. Due to SL's easy scripting system, it was very fast to rapidly prototype the code. Luckily, SL is quite close to the way other engines handle data types like quaternion rotations, vectors, etc., so being able to put together a build in SL and convert the code to C++, UnrealScript, TorqueScript, etc. is very handy. With most game engines you have to compile your code after testing it, typically in a console, or do a debug compile to be able to monitor different aspects of the engine.

When dealing with new code environments, it's great to rely on Stack Overflow for help, because they are some of the best people I've ever been able to talk to. Also, if you are working in an engine-specific environment, it's typically good to talk on the forums of said engine, like the Unreal forums or the Crytek forums, depending on what engine you have chosen to suit your needs. I personally grew up working on the idTech 3 engine, aka Quake 3 Arena's engine. After that I moved to Unreal Tournament 2004's engine, the Unreal 2 engine. From there I worked with modding communities as a volunteer creator while I was getting my "virtual sea legs", so to speak. The communities were very helpful and critical on how to create content for their engines/mods. We absolutely had to consider the FPS impact of mesh content inside the engines and what platform we were developing for. With Second Life just catching up to current rendering standards, it's important to keep your polycounts in a reasonable range. I tend to keep everything I do no larger than 12k, and that's if it's a cityscape or something very large. That spider is only about 10k tris, and that was what I chose for the Second Life rendering engine. If you have ever wondered why prims converted to mesh have such a high land impact, it's because the server estimates what the mesh surfaces should cost using the display impact of the object. That means prims should actually have a land impact that high too, but they don't because of the way SL was started. It really gives you a true perspective on how inefficient prim surfaces are in the SL world.

When approaching a new game engine or world to work in, it's important to study and research the thing you are about to work on. When making my spider I looked into physics engines, inverse kinematics, quaternion constraints, and many other things that I thought would be helpful to the build. When working in an environment like the idTech engine, they use matrix rotations instead of quats, so you have to alter your mindset and learn the ways of matrix arrays. When working with the Torque engine you get access to quaternion algorithms that do lots of the work for you, but it becomes just as complex. Luckily, since SL uses quaternions, the math is basically the same. Each engine has its own toolset to work with and its own methods of getting content into it.

For this I highly recommend avoiding Blender at all costs. It is a CGI-based modeling toolset and is rough at game modeling, and it also has a very small footprint in the industry. When working with a game design team, the technical artist must be trained in writing tools for the modelers in the software package the team buys. Very few game design teams use Blender in their pipeline, so it's recommended to know Maya or 3ds Max if you are looking at getting in with a game company. Some indie game design teams do use Blender, but chances are the engine designers don't have any tools or packages available to import from Blender, so the team will cause themselves tons of extra effort using the Blender suite. Since most professional-level tools cost a lot of money, it is recommended to save up, use a trial, grab a student edition, or, dun dun dun... find a cracked version to learn on.
  23. Haha, not quite. I've done 3D modeling work for about 11 years now with 3ds Max, ZBrush, and Photoshop, and I've worked on PC games for about 8 years: titles such as Killing Floor, DayZ, Military Forces Quake 3, Quake 3 Rally, a couple of indie Xbox 360 games, and a few more mods for PC games like Skyrim and whatnot. So coming to SL I really wanted to see what the scripting engine was capable of, and this kinda gave me my answer. I didn't want to use the built-in physics system at all; I wanted to write the physics myself to get custom abilities for the vehicle, and here we are.
  24. Here is a little demo video of what is possible with him so far. This is an early Prototype stage of his current state. Hope you like it
  25. The problem is that your object's sub-object mesh is rotated in relation to its world orientation. To see this when using the rotation or move tool, set the tool to use the local orientation by changing the View dropdown next to your Move/Rotate/Scale tools to Local. This will cause the gizmo (movement tool) to orient itself locally to the mesh object, which will let you see what direction the sub-object mesh is oriented toward.