musical game


Kofiko

Hello,

I need some help and advice building a musical interactive game.

I will first explain what I need, and I hope you guys can suggest the best approach.

The game will consist of one primary object that the avatar interfaces with; it controls several other objects through listen and talk (a chat channel), and they each play a sound in turn.

I thought about a menu that you can press, but I think a keyboard press would be a better way. However, I don't know how to connect commands to keyboard presses; maybe one of you knows?

The primary object needs to remember all the different commands it gave. I thought about a list that fills up: after every object has made its command, it sends the command back through talk and listen to the primary object, which puts it in the list.

After you stop pressing, the primary object will play all the sounds one after the other (from the list), but how can I do this without lag?

Can somebody think of a better way to implement this?


From this description, it's very difficult to tell what you are trying to do, but the basic question seems to be about how to process keyboard input.  You have a couple of options.  One is to open the public communication channel when you click on the object, then listen for chat input on that channel.  Communicating on the public channel is potentially laggy, but you won't create much of a problem if you are careful to keep it open only for the short time that it takes to type and receive a response.
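
If it helps, here is a minimal sketch of that first approach; the ten-second window and the messages are just assumptions:

integer gHandle;

default
{
    touch_start(integer num)
    {
        // Open a listener on the public channel, restricted to the avatar who touched.
        gHandle = llListen(0, "", llDetectedKey(0), "");
        llSay(0, "Type a command in local chat within 10 seconds.");
        llSetTimerEvent(10.0); // Close the listener again soon to keep lag down.
    }

    listen(integer channel, string name, key id, string msg)
    {
        llOwnerSay("Received command: " + msg);
        // ... act on the command here ...
        llListenRemove(gHandle);
        llSetTimerEvent(0.0);
    }

    timer()
    {
        llListenRemove(gHandle);
        llSetTimerEvent(0.0);
    }
}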

Another possibility is to use a control event, interpreting the key presses on your arrow keys plus PgUp and PgDn as specific commands.  Normally, those would be commands to move the object, but there's no reason why they couldn't be commands to do other things. 
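
For reference, a rough sketch of the control-event route might look like this. Keep in mind that PERMISSION_TAKE_CONTROLS is only granted while the avatar is sitting on the object or wearing it, and the llOwnerSay() calls here are just placeholders for your real commands:

default
{
    state_entry()
    {
        // Ask the owner for permission to take controls (works when seated or attached).
        llRequestPermissions(llGetOwner(), PERMISSION_TAKE_CONTROLS);
    }

    run_time_permissions(integer perm)
    {
        if (perm & PERMISSION_TAKE_CONTROLS)
        {
            llTakeControls(CONTROL_FWD | CONTROL_BACK | CONTROL_ROT_LEFT | CONTROL_ROT_RIGHT
                | CONTROL_UP | CONTROL_DOWN, TRUE, FALSE);
        }
    }

    control(key id, integer level, integer edge)
    {
        integer pressed = level & edge; // Keys that just went down in this event.
        if (pressed & CONTROL_FWD)       llOwnerSay("Up arrow");
        if (pressed & CONTROL_BACK)      llOwnerSay("Down arrow");
        if (pressed & CONTROL_ROT_LEFT)  llOwnerSay("Left arrow");
        if (pressed & CONTROL_ROT_RIGHT) llOwnerSay("Right arrow");
        if (pressed & CONTROL_UP)        llOwnerSay("PgUp");
        if (pressed & CONTROL_DOWN)      llOwnerSay("PgDn");
    }
}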

If you aren't limiting yourself to keyboard input, a common solution is to create a HUD interface on which you create buttons for various actions.  A mouse click on a button can mean whatever you want it to. 
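
For example, a HUD button script might simply relay the name of the clicked button prim to the main object on a private channel; the channel number and the prim-naming convention below are assumptions:

integer CHANNEL = -4200; // Must match the channel the main object listens on.

default
{
    touch_start(integer num)
    {
        // Name each button prim after the command it should send, e.g. "play" or "stop".
        llRegionSay(CHANNEL, llGetLinkName(llDetectedLinkNumber(0)));
    }
}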

If you can give a clearer description of the problem, perhaps we can offer more specific suggestions. Don't leap to a solution before you can explain where you want to go.


EDIT: I thought of linking every function to its wiki page... but I am too lazy. Here is the link to the LSL functions wiki.

To connect commands with the keyboard press, you must first know where you click.

If your keyboard is a single texture on a face of a prim, you can use llDetectedTouchST() or llDetectedTouchUV().

If your keyboard is made of several prims, llDetectedLinkNumber() will tell you which prim was clicked. If you use a single texture spanning the clickable faces, you can also use llDetectedTouchUV() to find where you clicked within the texture without worrying about the link number, with some filtering on the face number obtained with llDetectedTouchFace(), since we can assume that only one face is clickable.

Once you know where the click landed, the work of the script is to turn that position into a single number. That can simply be the clicked link number.

In the case of the texture, if the texture can be divided into a regular grid, that's easy.

vector st = llDetectedTouchST(0);
integer column = llFloor(st.x * WIDTH);  // WIDTH  = number of columns in the grid
integer line = llFloor(st.y * HEIGHT);   // HEIGHT = number of rows in the grid
integer command = (line * WIDTH) + column;

If you want to work with an irregular grid, WIDTH and HEIGHT can instead be the pixel size of the picture, so that 'column' and 'line' are integers, which speeds up the comparisons against a list describing the clickable areas as bottom-left and top-right corners. That comparison gives you an index, and that index is your command.
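
Here is one possible sketch of that irregular-grid lookup; the texture size and the rectangle list are made-up example values:

integer WIDTH = 512;  // Texture width in pixels (example value).
integer HEIGHT = 256; // Texture height in pixels (example value).

// Each clickable area: left, bottom, right, top, in pixels. Example values only.
list Areas = [ 10,  10, 120,  60,
              150,  10, 300,  60,
               10, 100, 300, 200];

integer uuAreaIndex(integer x, integer y)
{
    integer i;
    integer n = llGetListLength(Areas);
    for (i = 0; i < n; i += 4)
    {
        if (x >= llList2Integer(Areas, i) && y >= llList2Integer(Areas, i + 1)
            && x <= llList2Integer(Areas, i + 2) && y <= llList2Integer(Areas, i + 3))
        {
            return i / 4; // The index of the area, i.e. the command.
        }
    }
    return -1; // Not on any clickable area.
}

default
{
    touch_start(integer num)
    {
        vector st = llDetectedTouchST(0);
        integer command = uuAreaIndex(llFloor(st.x * WIDTH), llFloor(st.y * HEIGHT));
        llOwnerSay("Command: " + (string)command); // DEBUG
    }
}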

Using integer commands is the solution which will use the least memory if you want to store a list of them.

Now that we have commands, let's chat!

The rule of the "game" here is to narrow the listeners and to use llRegionSayTo() as much as possible.

First thing to do: Select a very negative channel.

Next, all the sub objects must learn the key of their master.

integer CHANNEL = -999666333;

integer Handle;
key MasterKey;

default
{
    state_entry()
    {
        Handle = llListen(CHANNEL, "MasterName", NULL_KEY, "");
    }

    listen(integer channel, string name, key id, string msg)
    {
        if (llGetOwnerKey(id) != llGetOwner()) { return; } // Listen only to owner's objects.
        //
        if (MasterKey)
        {
            // Deal with the commands here...
        }
        else
        {
            MasterKey = id;
            llListenRemove(Handle);
            Handle = llListen(CHANNEL, "MasterName", MasterKey, "");
            llRegionSayTo(MasterKey, CHANNEL, ""); // Chat back so the master learns this object's key.
            llOwnerSay("Ready"); // DEBUG
        }
    }
}

With this, the sub objects will wait for the first command from their master to narrow their listener.

It is not possible to filter the master's listener the same way but the master can learn the keys of its subs. (If you are willing to give the same name to all the sub objects, you can filter on their name.)

For now, I will assume we have 5 sub objects with different names.

integer CHANNEL = -999666333;
integer SUB_ALL = -1;

list SubName = ["Sub0", "Sub1", "Sub2", "Sub3", "Sub4"];
list SubKey = [0, 0, 0, 0, 0, 0]; // One entry for each sub, plus one for SUB_ALL.

uuCommand(integer sub, integer command)
{
    key id = llList2Key(SubKey, sub);
    if (id)
    {
        llRegionSayTo(id, CHANNEL, (string)command);
    }
    else
    {
        llRegionSay(CHANNEL, (string)command);
    }
}

default
{
    state_entry()
    {
        llListen(CHANNEL, "", "", "");
    }

    touch_start(integer num)
    {
        integer sub;
        integer command;
        // Deal with the touches here...
        //
        //
        // At this point, we must know to whom we are talking.
        // It can be SUB_ALL (-1) to talk to all the subs at once.
        uuCommand(sub, command);
    }

    listen(integer channel, string name, key id, string msg)
    {
        if (llGetOwnerKey(id) != llGetOwner()) { return; } // Listen only to owner's objects.
        //
        if (llListFindList(SubKey, [id]) == -1) // Unknown key talking.
        {
            integer i = llListFindList(SubName, [name]); // Is it a sub?
            if (i != -1) // Yes!
            {
                SubKey = llListReplaceList(SubKey, [id], i, i); // Register it.
                llOwnerSay("Got key for " + name); // DEBUG
            }
        }
    }
}

(If you only ever need to talk to all the subs at the same time, a lot of that script is unnecessary.)

This is purely theoretical work, but if I am not too out of my mind, you should be able to reset, or edit and save, the script of any sub or the master at any time.

If you want more help, you will have to give more details, including unfinished or non-working scripts.

 


I'm guessing that in this case the "keyboard" is intended to be that QWERTY thing beside your mouse.

In addition to taking controls as Rolig suggested (which responds to one restricted set of keys), another option might be to define a bunch of Gestures that use the function keys as shortcuts that produce unique chat messages on some standard channel on which your music script is listening.

It's still a (different) restricted set of keys, might interfere with other Gestures the user has loaded, and isn't very elegant to distribute, but I thought it might be relevant, depending on details of the application.
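
For what it is worth, the listening end of that idea could be as small as this sketch, assuming the gestures produce messages like "note1" on the same channel the script listens to; the channel number and sound names here are placeholders:

integer CHANNEL = 7; // The channel your gestures chat on (placeholder).

// One inventory sound per note; the names are placeholders.
list Notes = ["note1", "note2", "note3", "note4", "note5", "note6", "note7", "note8"];

default
{
    state_entry()
    {
        llListen(CHANNEL, "", NULL_KEY, "");
    }

    listen(integer channel, string name, key id, string msg)
    {
        if (llListFindList(Notes, [msg]) != -1) // Message matches a known note?
        {
            llPlaySound(msg, 1.0); // Play the inventory sound with that name.
        }
    }
}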


Hi,

Thanks for your reply.

I would like to explain my goal a little bit more.

I want to create an instrument that will be able to play individual notes from my keyboard:
8 notes, with a specific keyboard key for each note.
I would like it to work for any avatar.

The avatar will stand in the middle and, with the keyboard keys (let's say, for example, 1-9), will be able to play 1 for the note C, 2 for the note D, and so on.

Is this possible? And what scripts do I need?
If you think that your scripts are writeable, please send them.

Thank you all for your help.

 


Yes, of course it's possible, although maybe not in exactly the way you imagine. As people have explained in this thread, there's no easy way to make a script respond immediately when you press, say, the "A" key. You may be better off just creating a set of gestures to do that instead of writing a script. You will need to upload a set of individual .wav files to make up a full twelve-tone scale. Be sure that you make them sound much louder in your editing program (WinAmp or whatever) than you want them to be in SL. Otherwise they will be disappointingly soft when you hear them in world.

If you are willing to give up the notion of playing notes from key touches, you have many options for writing a script to play them by clicking on different prims in a linkset or different parts of a single prim.  (By pure chance, I happen to have written one like that for a friend just before I left town for the holidays. If I remember when I return home in a week or so and can get in world to my inventory, I'll post it here.) You could also "pre-record" a tune by typing it out on the keyboard and then "playing it back" when you send the tune as a line in chat. I suggest that you take a look through the LSL Wiki at the descriptions of the functions below; a rough sketch of the click-to-play idea follows the list.

llPlaySound

llGetLinkNumber

llGetLinkName

llDetectedTouchST
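
To illustrate the click-to-play idea, here is a minimal sketch that assumes each child prim in the linkset is named after the inventory sound it should trigger (e.g. "C", "D", "E"):

default
{
    touch_start(integer num)
    {
        string note = llGetLinkName(llDetectedLinkNumber(0));
        if (llGetInventoryType(note) == INVENTORY_SOUND) // Only play if a matching sound exists.
        {
            llPlaySound(note, 1.0);
        }
    }
}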

